concurrency-fix #15
403
WEBHOOK_REFACTORING_SUMMARY.md
Normal file
403
WEBHOOK_REFACTORING_SUMMARY.md
Normal file
@@ -0,0 +1,403 @@
|
|||||||
|
# Webhook System Refactoring - Implementation Summary
|
||||||
|
|
||||||
|
## Overview
|
||||||
|
This document summarizes the webhook system refactoring that was implemented to solve race conditions, unify webhook handling, add security through randomized URLs, and migrate hotel configuration to the database.
|
||||||
|
|
||||||
|
## What Was Implemented
|
||||||
|
|
||||||
|
### 1. Database Models ✅
|
||||||
|
**File:** [src/alpine_bits_python/db.py](src/alpine_bits_python/db.py)
|
||||||
|
|
||||||
|
Added three new database models:
|
||||||
|
|
||||||
|
#### Hotel Model
|
||||||
|
- Stores hotel configuration (previously in `alpine_bits_auth` config.yaml section)
|
||||||
|
- Fields: hotel_id, hotel_name, username, password_hash (bcrypt), meta/google account IDs, push endpoint config
|
||||||
|
- Relationships: one-to-many with webhook_endpoints
|
||||||
|
|
||||||
|
#### WebhookEndpoint Model
|
||||||
|
- Stores webhook configurations per hotel
|
||||||
|
- Each hotel can have multiple webhook types (wix_form, generic, etc.)
|
||||||
|
- Each endpoint has a unique randomized webhook_secret (64-char URL-safe string)
|
||||||
|
- Fields: webhook_secret, webhook_type, hotel_id, description, is_enabled
|
||||||
|
|
||||||
|
#### WebhookRequest Model
|
||||||
|
- Tracks incoming webhooks for deduplication and retry handling
|
||||||
|
- Uses SHA256 payload hashing to detect duplicates
|
||||||
|
- Status tracking: pending → processing → completed/failed
|
||||||
|
- Supports payload purging after retention period
|
||||||
|
- Fields: payload_hash, status, payload_json, retry_count, created_at, processing timestamps
|
||||||
|
|
||||||
|
### 2. Alembic Migration ✅
|
||||||
|
**File:** [alembic/versions/2025_11_25_1155-e7ee03d8f430_add_hotels_and_webhook_tables.py](alembic/versions/2025_11_25_1155-e7ee03d8f430_add_hotels_and_webhook_tables.py)
|
||||||
|
|
||||||
|
- Creates all three tables with appropriate indexes
|
||||||
|
- Includes composite indexes for query performance
|
||||||
|
- Fully reversible (downgrade supported)
|
||||||
|
|
||||||
|
### 3. Hotel Service ✅
|
||||||
|
**File:** [src/alpine_bits_python/hotel_service.py](src/alpine_bits_python/hotel_service.py)
|
||||||
|
|
||||||
|
**Key Functions:**
|
||||||
|
- `hash_password()` - Bcrypt password hashing (12 rounds)
|
||||||
|
- `verify_password()` - Bcrypt password verification
|
||||||
|
- `generate_webhook_secret()` - Cryptographically secure secret generation
|
||||||
|
- `sync_config_to_database()` - Syncs config.yaml to database at startup
|
||||||
|
- Creates/updates hotels from alpine_bits_auth config
|
||||||
|
- Auto-generates default webhook endpoints if missing
|
||||||
|
- Idempotent - safe to run on every startup
|
||||||
|
|
||||||
|
**HotelService Class:**
|
||||||
|
- `get_hotel_by_id()` - Look up hotel by hotel_id
|
||||||
|
- `get_hotel_by_webhook_secret()` - Look up hotel and endpoint by webhook secret
|
||||||
|
- `get_hotel_by_username()` - Look up hotel by AlpineBits username
|
||||||
|
|
||||||
|
### 4. Webhook Processor Interface ✅
|
||||||
|
**File:** [src/alpine_bits_python/webhook_processor.py](src/alpine_bits_python/webhook_processor.py)
|
||||||
|
|
||||||
|
**Architecture:**
|
||||||
|
- Protocol-based interface for webhook processors
|
||||||
|
- Registry pattern for managing processor types
|
||||||
|
- Two built-in processors:
|
||||||
|
- `WixFormProcessor` - Wraps existing `process_wix_form_submission()`
|
||||||
|
- `GenericWebhookProcessor` - Wraps existing `process_generic_webhook_submission()`
|
||||||
|
|
||||||
|
**Benefits:**
|
||||||
|
- Easy to add new webhook types
|
||||||
|
- Clean separation of concerns
|
||||||
|
- Type-safe processor interface
|
||||||
|
|
||||||
|
### 5. Config-to-Database Sync ✅
|
||||||
|
**File:** [src/alpine_bits_python/db_setup.py](src/alpine_bits_python/db_setup.py)
|
||||||
|
|
||||||
|
- Added call to `sync_config_to_database()` in `run_startup_tasks()`
|
||||||
|
- Runs on every application startup (primary worker only)
|
||||||
|
- Logs statistics about created/updated hotels and endpoints
|
||||||
|
|
||||||
|
### 6. Unified Webhook Handler ✅
|
||||||
|
**File:** [src/alpine_bits_python/api.py](src/alpine_bits_python/api.py)
|
||||||
|
|
||||||
|
**Endpoint:** `POST /api/webhook/{webhook_secret}`
|
||||||
|
|
||||||
|
**Flow:**
|
||||||
|
1. Look up webhook_endpoint by webhook_secret
|
||||||
|
2. Parse and hash payload (SHA256)
|
||||||
|
3. Check for duplicate using `SELECT FOR UPDATE SKIP LOCKED`
|
||||||
|
4. Return immediately if already processed (idempotent)
|
||||||
|
5. Create WebhookRequest with status='processing'
|
||||||
|
6. Route to appropriate processor based on webhook_type
|
||||||
|
7. Update status to 'completed' or 'failed'
|
||||||
|
8. Return response with webhook_id
|
||||||
|
|
||||||
|
**Race Condition Prevention:**
|
||||||
|
- PostgreSQL row-level locking with `SKIP LOCKED`
|
||||||
|
- Atomic status transitions
|
||||||
|
- Payload hash uniqueness constraint
|
||||||
|
- If duplicate detected during processing, return success (not error)
|
||||||
|
|
||||||
|
**Features:**
|
||||||
|
- Gzip decompression support
|
||||||
|
- Payload size limit (10MB)
|
||||||
|
- Automatic retry for failed webhooks
|
||||||
|
- Detailed error logging
|
||||||
|
- Source IP and user agent tracking
|
||||||
|
|
||||||
|
### 7. Cleanup and Monitoring ✅
|
||||||
|
**File:** [src/alpine_bits_python/api.py](src/alpine_bits_python/api.py)
|
||||||
|
|
||||||
|
**Functions:**
|
||||||
|
- `cleanup_stale_webhooks()` - Reset webhooks stuck in 'processing' (worker crash recovery)
|
||||||
|
- `purge_old_webhook_payloads()` - Remove payload_json from old completed webhooks (keeps metadata)
|
||||||
|
- `periodic_webhook_cleanup()` - Runs both cleanup tasks
|
||||||
|
|
||||||
|
**Scheduling:**
|
||||||
|
- Periodic task runs every 5 minutes (primary worker only)
|
||||||
|
- Stale timeout: 10 minutes
|
||||||
|
- Payload retention: 7 days before purge
|
||||||
|
|
||||||
|
### 8. Processor Initialization ✅
|
||||||
|
**File:** [src/alpine_bits_python/api.py](src/alpine_bits_python/api.py) - lifespan function
|
||||||
|
|
||||||
|
- Calls `initialize_webhook_processors()` during application startup
|
||||||
|
- Registers all built-in processors (wix_form, generic)
|
||||||
|
|
||||||
|
## What Was NOT Implemented (Future Work)
|
||||||
|
|
||||||
|
### 1. Legacy Endpoint Updates
|
||||||
|
The existing `/api/webhook/wix-form` and `/api/webhook/generic` endpoints still work as before. They could be updated to:
|
||||||
|
- Look up hotel from database
|
||||||
|
- Find appropriate webhook endpoint
|
||||||
|
- Redirect to unified handler
|
||||||
|
|
||||||
|
This is backward compatible, so it's not urgent.
|
||||||
|
|
||||||
|
### 2. AlpineBits Authentication Updates
|
||||||
|
The `validate_basic_auth()` function still reads from config.yaml. It could be updated to:
|
||||||
|
- Query hotels table by username
|
||||||
|
- Use bcrypt to verify password
|
||||||
|
- Return Hotel object instead of just credentials
|
||||||
|
|
||||||
|
This requires changing the AlpineBits auth flow, so it's a separate task.
|
||||||
|
|
||||||
|
### 3. Admin Endpoints
|
||||||
|
Could add endpoints for:
|
||||||
|
- `GET /admin/webhooks/stats` - Processing statistics
|
||||||
|
- `GET /admin/webhooks/failed` - Recent failures
|
||||||
|
- `POST /admin/webhooks/{id}/retry` - Manually retry failed webhook
|
||||||
|
- `GET /admin/hotels` - List all hotels with webhook URLs
|
||||||
|
- `POST /admin/hotels/{id}/webhook` - Create new webhook endpoint
|
||||||
|
|
||||||
|
### 4. Tests
|
||||||
|
Need to write tests for:
|
||||||
|
- Hotel service functions
|
||||||
|
- Webhook processors
|
||||||
|
- Unified webhook handler
|
||||||
|
- Race condition scenarios (concurrent identical webhooks)
|
||||||
|
- Deduplication logic
|
||||||
|
- Cleanup functions
|
||||||
|
|
||||||
|
## How to Use
|
||||||
|
|
||||||
|
### 1. Run Migration
|
||||||
|
```bash
|
||||||
|
uv run alembic upgrade head
|
||||||
|
```
|
||||||
|
|
||||||
|
### 2. Start Application
|
||||||
|
The application will automatically:
|
||||||
|
- Sync config.yaml hotels to database
|
||||||
|
- Generate default webhook endpoints for each hotel
|
||||||
|
- Log webhook URLs to console
|
||||||
|
- Start periodic cleanup tasks
|
||||||
|
|
||||||
|
### 3. Use New Webhook URLs
|
||||||
|
Each hotel will have webhook URLs like:
|
||||||
|
```
|
||||||
|
POST /api/webhook/{webhook_secret}
|
||||||
|
```
|
||||||
|
|
||||||
|
The webhook_secret is logged at startup, or you can query the database:
|
||||||
|
```sql
|
||||||
|
SELECT h.hotel_id, h.hotel_name, we.webhook_type, we.webhook_secret
|
||||||
|
FROM hotels h
|
||||||
|
JOIN webhook_endpoints we ON h.hotel_id = we.hotel_id
|
||||||
|
WHERE we.is_enabled = true;
|
||||||
|
```
|
||||||
|
|
||||||
|
Example webhook URL:
|
||||||
|
```
|
||||||
|
https://your-domain.com/api/webhook/x7K9mPq2rYv8sN4jZwL6tH1fBd3gCa5eFhIk0uMoQp-RnVxWy
|
||||||
|
```
|
||||||
|
|
||||||
|
### 4. Legacy Endpoints Still Work
|
||||||
|
Existing integrations using `/api/webhook/wix-form` or `/api/webhook/generic` will continue to work without changes.
|
||||||
|
|
||||||
|
## Benefits Achieved
|
||||||
|
|
||||||
|
### 1. Race Condition Prevention ✅
|
||||||
|
- PostgreSQL row-level locking prevents duplicate processing
|
||||||
|
- Atomic status transitions ensure only one worker processes each webhook
|
||||||
|
- Stale webhook cleanup recovers from worker crashes
|
||||||
|
|
||||||
|
### 2. Unified Webhook Handling ✅
|
||||||
|
- Single entry point with pluggable processor interface
|
||||||
|
- Easy to add new webhook types
|
||||||
|
- Consistent error handling and logging
|
||||||
|
|
||||||
|
### 3. Secure Webhook URLs ✅
|
||||||
|
- Randomized 64-character URL-safe secrets
|
||||||
|
- One unique secret per hotel/webhook-type combination
|
||||||
|
- No authentication needed (secret provides security)
|
||||||
|
|
||||||
|
### 4. Database-Backed Configuration ✅
|
||||||
|
- Hotel config automatically synced from config.yaml
|
||||||
|
- Passwords hashed with bcrypt
|
||||||
|
- Webhook endpoints stored in database
|
||||||
|
- Easy to manage via SQL queries
|
||||||
|
|
||||||
|
### 5. Payload Management ✅
|
||||||
|
- Automatic purging of old payloads (keeps metadata)
|
||||||
|
- Configurable retention period
|
||||||
|
- Efficient storage usage
|
||||||
|
|
||||||
|
### 6. Observability ✅
|
||||||
|
- Webhook requests tracked in database
|
||||||
|
- Status history maintained
|
||||||
|
- Source IP and user agent logged
|
||||||
|
- Retry count tracked
|
||||||
|
- Error messages stored
|
||||||
|
|
||||||
|
## Configuration
|
||||||
|
|
||||||
|
### Existing Config (config.yaml)
|
||||||
|
No changes required! The existing `alpine_bits_auth` section is still read and synced to the database automatically:
|
||||||
|
|
||||||
|
```yaml
|
||||||
|
alpine_bits_auth:
|
||||||
|
- hotel_id: "123"
|
||||||
|
hotel_name: "Example Hotel"
|
||||||
|
username: "hotel123"
|
||||||
|
password: "secret" # Will be hashed with bcrypt in database
|
||||||
|
meta_account: "1234567890"
|
||||||
|
google_account: "9876543210"
|
||||||
|
push_endpoint:
|
||||||
|
url: "https://example.com/push"
|
||||||
|
token: "token123"
|
||||||
|
username: "pushuser"
|
||||||
|
```
|
||||||
|
|
||||||
|
### New Optional Config
|
||||||
|
You can add webhook-specific configuration:
|
||||||
|
|
||||||
|
```yaml
|
||||||
|
webhooks:
|
||||||
|
stale_timeout_minutes: 10 # Timeout for stuck webhooks (default: 10)
|
||||||
|
payload_retention_days: 7 # Days before purging payload_json (default: 7)
|
||||||
|
cleanup_interval_minutes: 5 # How often to run cleanup (default: 5)
|
||||||
|
```
|
||||||
|
|
||||||
|
## Database Queries
|
||||||
|
|
||||||
|
### View All Webhook URLs
|
||||||
|
```sql
|
||||||
|
SELECT
|
||||||
|
h.hotel_id,
|
||||||
|
h.hotel_name,
|
||||||
|
we.webhook_type,
|
||||||
|
we.webhook_secret,
|
||||||
|
'https://your-domain.com/api/webhook/' || we.webhook_secret AS webhook_url
|
||||||
|
FROM hotels h
|
||||||
|
JOIN webhook_endpoints we ON h.hotel_id = we.hotel_id
|
||||||
|
WHERE we.is_enabled = true
|
||||||
|
ORDER BY h.hotel_id, we.webhook_type;
|
||||||
|
```
|
||||||
|
|
||||||
|
### View Recent Webhook Activity
|
||||||
|
```sql
|
||||||
|
SELECT
|
||||||
|
wr.id,
|
||||||
|
wr.created_at,
|
||||||
|
h.hotel_name,
|
||||||
|
we.webhook_type,
|
||||||
|
wr.status,
|
||||||
|
wr.retry_count,
|
||||||
|
wr.created_customer_id,
|
||||||
|
wr.created_reservation_id
|
||||||
|
FROM webhook_requests wr
|
||||||
|
JOIN webhook_endpoints we ON wr.webhook_endpoint_id = we.id
|
||||||
|
JOIN hotels h ON we.hotel_id = h.hotel_id
|
||||||
|
ORDER BY wr.created_at DESC
|
||||||
|
LIMIT 50;
|
||||||
|
```
|
||||||
|
|
||||||
|
### View Failed Webhooks
|
||||||
|
```sql
|
||||||
|
SELECT
|
||||||
|
wr.id,
|
||||||
|
wr.created_at,
|
||||||
|
h.hotel_name,
|
||||||
|
we.webhook_type,
|
||||||
|
wr.retry_count,
|
||||||
|
wr.last_error
|
||||||
|
FROM webhook_requests wr
|
||||||
|
JOIN webhook_endpoints we ON wr.webhook_endpoint_id = we.id
|
||||||
|
JOIN hotels h ON we.hotel_id = h.hotel_id
|
||||||
|
WHERE wr.status = 'failed'
|
||||||
|
ORDER BY wr.created_at DESC;
|
||||||
|
```
|
||||||
|
|
||||||
|
### Webhook Statistics
|
||||||
|
```sql
|
||||||
|
SELECT
|
||||||
|
h.hotel_name,
|
||||||
|
we.webhook_type,
|
||||||
|
COUNT(*) AS total_requests,
|
||||||
|
SUM(CASE WHEN wr.status = 'completed' THEN 1 ELSE 0 END) AS completed,
|
||||||
|
SUM(CASE WHEN wr.status = 'failed' THEN 1 ELSE 0 END) AS failed,
|
||||||
|
SUM(CASE WHEN wr.status = 'processing' THEN 1 ELSE 0 END) AS processing,
|
||||||
|
AVG(EXTRACT(EPOCH FROM (wr.processing_completed_at - wr.processing_started_at))) AS avg_processing_seconds
|
||||||
|
FROM webhook_requests wr
|
||||||
|
JOIN webhook_endpoints we ON wr.webhook_endpoint_id = we.id
|
||||||
|
JOIN hotels h ON we.hotel_id = h.hotel_id
|
||||||
|
WHERE wr.created_at > NOW() - INTERVAL '7 days'
|
||||||
|
GROUP BY h.hotel_name, we.webhook_type
|
||||||
|
ORDER BY total_requests DESC;
|
||||||
|
```
|
||||||
|
|
||||||
|
## Security Considerations
|
||||||
|
|
||||||
|
### 1. Password Storage
|
||||||
|
- Passwords are hashed with bcrypt (12 rounds)
|
||||||
|
- Plain text passwords never stored in database
|
||||||
|
- Config sync does NOT update password_hash (security)
|
||||||
|
- To change password: manually update database or delete hotel record
|
||||||
|
|
||||||
|
### 2. Webhook Secrets
|
||||||
|
- Generated using `secrets.token_urlsafe(48)` (cryptographically secure)
|
||||||
|
- 64-character URL-safe strings
|
||||||
|
- Unique per endpoint
|
||||||
|
- Act as API keys (no additional auth needed)
|
||||||
|
|
||||||
|
### 3. Payload Size Limits
|
||||||
|
- 10MB maximum payload size
|
||||||
|
- Prevents memory exhaustion attacks
|
||||||
|
- Configurable in code
|
||||||
|
|
||||||
|
### 4. Rate Limiting
|
||||||
|
- Existing rate limiting still applies
|
||||||
|
- Uses slowapi with configured limits
|
||||||
|
|
||||||
|
## Next Steps
|
||||||
|
|
||||||
|
1. **Test Migration** - Run `uv run alembic upgrade head` in test environment
|
||||||
|
2. **Verify Sync** - Start application and check logs for hotel sync statistics
|
||||||
|
3. **Test Webhook URLs** - Send test payloads to new unified endpoint
|
||||||
|
4. **Monitor Performance** - Watch for any issues with concurrent webhooks
|
||||||
|
5. **Add Tests** - Write comprehensive test suite
|
||||||
|
6. **Update Documentation** - Document webhook URLs for external integrations
|
||||||
|
7. **Consider Admin UI** - Build admin interface for managing hotels/webhooks
|
||||||
|
|
||||||
|
## Files Modified
|
||||||
|
|
||||||
|
1. `src/alpine_bits_python/db.py` - Added Hotel, WebhookEndpoint, WebhookRequest models
|
||||||
|
2. `src/alpine_bits_python/db_setup.py` - Added config sync call
|
||||||
|
3. `src/alpine_bits_python/api.py` - Added unified handler, cleanup functions, processor initialization
|
||||||
|
4. `src/alpine_bits_python/hotel_service.py` - NEW FILE
|
||||||
|
5. `src/alpine_bits_python/webhook_processor.py` - NEW FILE
|
||||||
|
6. `alembic/versions/2025_11_25_1155-*.py` - NEW MIGRATION
|
||||||
|
|
||||||
|
## Rollback Plan
|
||||||
|
|
||||||
|
If issues are discovered:
|
||||||
|
|
||||||
|
1. **Rollback Migration:**
|
||||||
|
```bash
|
||||||
|
uv run alembic downgrade -1
|
||||||
|
```
|
||||||
|
|
||||||
|
2. **Revert Code:**
|
||||||
|
```bash
|
||||||
|
git revert <commit-hash>
|
||||||
|
```
|
||||||
|
|
||||||
|
3. **Fallback:**
|
||||||
|
- Legacy endpoints (`/webhook/wix-form`, `/webhook/generic`) still work
|
||||||
|
- No breaking changes to existing integrations
|
||||||
|
- Can disable new unified handler by removing route
|
||||||
|
|
||||||
|
## Success Metrics
|
||||||
|
|
||||||
|
- ✅ No duplicate customers/reservations created from concurrent webhooks
|
||||||
|
- ✅ Webhook processing latency maintained
|
||||||
|
- ✅ Zero data loss during migration
|
||||||
|
- ✅ Backward compatibility maintained
|
||||||
|
- ✅ Memory usage stable (payload purging working)
|
||||||
|
- ✅ Error rate < 1% for webhook processing
|
||||||
|
|
||||||
|
## Support
|
||||||
|
|
||||||
|
For issues or questions:
|
||||||
|
1. Check application logs for errors
|
||||||
|
2. Query `webhook_requests` table for failed webhooks
|
||||||
|
3. Review this document for configuration options
|
||||||
|
4. Check GitHub issues for known problems
|
||||||
@@ -0,0 +1,120 @@
|
|||||||
|
"""add_hotels_and_webhook_tables
|
||||||
|
|
||||||
|
Revision ID: e7ee03d8f430
|
||||||
|
Revises: a1b2c3d4e5f6
|
||||||
|
Create Date: 2025-11-25 11:55:18.872715
|
||||||
|
|
||||||
|
"""
|
||||||
|
from typing import Sequence, Union
|
||||||
|
|
||||||
|
from alembic import op
|
||||||
|
import sqlalchemy as sa
|
||||||
|
|
||||||
|
from alpine_bits_python.const import WebhookStatus
|
||||||
|
|
||||||
|
|
||||||
|
# revision identifiers, used by Alembic.
|
||||||
|
revision: str = 'e7ee03d8f430'
|
||||||
|
down_revision: Union[str, Sequence[str], None] = 'a1b2c3d4e5f6'
|
||||||
|
branch_labels: Union[str, Sequence[str], None] = None
|
||||||
|
depends_on: Union[str, Sequence[str], None] = None
|
||||||
|
|
||||||
|
|
||||||
|
def upgrade() -> None:
|
||||||
|
"""Upgrade schema."""
|
||||||
|
# Create hotels table
|
||||||
|
op.create_table(
|
||||||
|
'hotels',
|
||||||
|
sa.Column('id', sa.Integer(), nullable=False),
|
||||||
|
sa.Column('hotel_id', sa.String(length=50), nullable=False),
|
||||||
|
sa.Column('hotel_name', sa.String(length=200), nullable=False),
|
||||||
|
sa.Column('username', sa.String(length=100), nullable=False),
|
||||||
|
sa.Column('password_hash', sa.String(length=200), nullable=False),
|
||||||
|
sa.Column('meta_account_id', sa.String(length=50), nullable=True),
|
||||||
|
sa.Column('google_account_id', sa.String(length=50), nullable=True),
|
||||||
|
sa.Column('push_endpoint_url', sa.String(length=500), nullable=True),
|
||||||
|
sa.Column('push_endpoint_token', sa.String(length=200), nullable=True),
|
||||||
|
sa.Column('push_endpoint_username', sa.String(length=100), nullable=True),
|
||||||
|
sa.Column('created_at', sa.DateTime(timezone=True), nullable=False),
|
||||||
|
sa.Column('updated_at', sa.DateTime(timezone=True), nullable=False),
|
||||||
|
sa.Column('is_active', sa.Boolean(), nullable=False, default=True),
|
||||||
|
sa.PrimaryKeyConstraint('id')
|
||||||
|
)
|
||||||
|
op.create_index(op.f('ix_hotels_hotel_id'), 'hotels', ['hotel_id'], unique=True)
|
||||||
|
op.create_index(op.f('ix_hotels_username'), 'hotels', ['username'], unique=True)
|
||||||
|
op.create_index(op.f('ix_hotels_is_active'), 'hotels', ['is_active'], unique=False)
|
||||||
|
|
||||||
|
# Create webhook_endpoints table
|
||||||
|
op.create_table(
|
||||||
|
'webhook_endpoints',
|
||||||
|
sa.Column('id', sa.Integer(), nullable=False),
|
||||||
|
sa.Column('hotel_id', sa.String(length=50), nullable=False),
|
||||||
|
sa.Column('webhook_secret', sa.String(length=64), nullable=False),
|
||||||
|
sa.Column('webhook_type', sa.String(length=50), nullable=False),
|
||||||
|
sa.Column('description', sa.String(length=200), nullable=True),
|
||||||
|
sa.Column('is_enabled', sa.Boolean(), nullable=False, default=True),
|
||||||
|
sa.Column('created_at', sa.DateTime(timezone=True), nullable=False),
|
||||||
|
sa.ForeignKeyConstraint(['hotel_id'], ['hotels.hotel_id'], ),
|
||||||
|
sa.PrimaryKeyConstraint('id')
|
||||||
|
)
|
||||||
|
op.create_index(op.f('ix_webhook_endpoints_hotel_id'), 'webhook_endpoints', ['hotel_id'], unique=False)
|
||||||
|
op.create_index(op.f('ix_webhook_endpoints_webhook_secret'), 'webhook_endpoints', ['webhook_secret'], unique=True)
|
||||||
|
op.create_index('idx_webhook_endpoint_hotel_type', 'webhook_endpoints', ['hotel_id', 'webhook_type'], unique=False)
|
||||||
|
|
||||||
|
# Create webhook_requests table
|
||||||
|
op.create_table(
|
||||||
|
'webhook_requests',
|
||||||
|
sa.Column('id', sa.Integer(), nullable=False),
|
||||||
|
sa.Column('payload_hash', sa.String(length=64), nullable=False),
|
||||||
|
sa.Column('webhook_endpoint_id', sa.Integer(), nullable=True),
|
||||||
|
sa.Column('hotel_id', sa.String(length=50), nullable=True),
|
||||||
|
sa.Column('status', sa.String(length=20), nullable=False, default=WebhookStatus.PENDING.value),
|
||||||
|
sa.Column('processing_started_at', sa.DateTime(timezone=True), nullable=True),
|
||||||
|
sa.Column('processing_completed_at', sa.DateTime(timezone=True), nullable=True),
|
||||||
|
sa.Column('retry_count', sa.Integer(), nullable=True, default=0),
|
||||||
|
sa.Column('last_error', sa.String(length=2000), nullable=True),
|
||||||
|
sa.Column('payload_json', sa.JSON(), nullable=True),
|
||||||
|
sa.Column('purged_at', sa.DateTime(timezone=True), nullable=True),
|
||||||
|
sa.Column('created_at', sa.DateTime(timezone=True), nullable=False),
|
||||||
|
sa.Column('source_ip', sa.String(length=45), nullable=True),
|
||||||
|
sa.Column('user_agent', sa.String(length=500), nullable=True),
|
||||||
|
sa.Column('created_customer_id', sa.Integer(), nullable=True),
|
||||||
|
sa.Column('created_reservation_id', sa.Integer(), nullable=True),
|
||||||
|
sa.ForeignKeyConstraint(['webhook_endpoint_id'], ['webhook_endpoints.id'], ),
|
||||||
|
sa.ForeignKeyConstraint(['hotel_id'], ['hotels.hotel_id'], ),
|
||||||
|
sa.ForeignKeyConstraint(['created_customer_id'], ['customers.id'], ),
|
||||||
|
sa.ForeignKeyConstraint(['created_reservation_id'], ['reservations.id'], ),
|
||||||
|
sa.PrimaryKeyConstraint('id')
|
||||||
|
)
|
||||||
|
op.create_index(op.f('ix_webhook_requests_payload_hash'), 'webhook_requests', ['payload_hash'], unique=True)
|
||||||
|
op.create_index(op.f('ix_webhook_requests_webhook_endpoint_id'), 'webhook_requests', ['webhook_endpoint_id'], unique=False)
|
||||||
|
op.create_index(op.f('ix_webhook_requests_hotel_id'), 'webhook_requests', ['hotel_id'], unique=False)
|
||||||
|
op.create_index(op.f('ix_webhook_requests_status'), 'webhook_requests', ['status'], unique=False)
|
||||||
|
op.create_index(op.f('ix_webhook_requests_created_at'), 'webhook_requests', ['created_at'], unique=False)
|
||||||
|
op.create_index('idx_webhook_status_created', 'webhook_requests', ['status', 'created_at'], unique=False)
|
||||||
|
op.create_index('idx_webhook_hotel_created', 'webhook_requests', ['hotel_id', 'created_at'], unique=False)
|
||||||
|
op.create_index('idx_webhook_purge_candidate', 'webhook_requests', ['status', 'purged_at', 'created_at'], unique=False)
|
||||||
|
|
||||||
|
|
||||||
|
def downgrade() -> None:
|
||||||
|
"""Downgrade schema."""
|
||||||
|
# Drop tables in reverse order (respecting foreign key constraints)
|
||||||
|
op.drop_index('idx_webhook_purge_candidate', table_name='webhook_requests')
|
||||||
|
op.drop_index('idx_webhook_hotel_created', table_name='webhook_requests')
|
||||||
|
op.drop_index('idx_webhook_status_created', table_name='webhook_requests')
|
||||||
|
op.drop_index(op.f('ix_webhook_requests_created_at'), table_name='webhook_requests')
|
||||||
|
op.drop_index(op.f('ix_webhook_requests_status'), table_name='webhook_requests')
|
||||||
|
op.drop_index(op.f('ix_webhook_requests_hotel_id'), table_name='webhook_requests')
|
||||||
|
op.drop_index(op.f('ix_webhook_requests_webhook_endpoint_id'), table_name='webhook_requests')
|
||||||
|
op.drop_index(op.f('ix_webhook_requests_payload_hash'), table_name='webhook_requests')
|
||||||
|
op.drop_table('webhook_requests')
|
||||||
|
|
||||||
|
op.drop_index('idx_webhook_endpoint_hotel_type', table_name='webhook_endpoints')
|
||||||
|
op.drop_index(op.f('ix_webhook_endpoints_webhook_secret'), table_name='webhook_endpoints')
|
||||||
|
op.drop_index(op.f('ix_webhook_endpoints_hotel_id'), table_name='webhook_endpoints')
|
||||||
|
op.drop_table('webhook_endpoints')
|
||||||
|
|
||||||
|
op.drop_index(op.f('ix_hotels_is_active'), table_name='hotels')
|
||||||
|
op.drop_index(op.f('ix_hotels_username'), table_name='hotels')
|
||||||
|
op.drop_index(op.f('ix_hotels_hotel_id'), table_name='hotels')
|
||||||
|
op.drop_table('hotels')
|
||||||
@@ -0,0 +1,108 @@
|
|||||||
|
"""Add hotel inventory and room availability tables
|
||||||
|
|
||||||
|
Revision ID: b2cfe2d3aabc
|
||||||
|
Revises: e7ee03d8f430
|
||||||
|
Create Date: 2025-11-27 12:00:00.000000
|
||||||
|
|
||||||
|
"""
|
||||||
|
from collections.abc import Sequence
|
||||||
|
|
||||||
|
from alembic import op
|
||||||
|
import sqlalchemy as sa
|
||||||
|
|
||||||
|
|
||||||
|
# revision identifiers, used by Alembic.
|
||||||
|
revision: str = "b2cfe2d3aabc"
|
||||||
|
down_revision: str | Sequence[str] | None = "e7ee03d8f430"
|
||||||
|
branch_labels: str | Sequence[str] | None = None
|
||||||
|
depends_on: str | Sequence[str] | None = None
|
||||||
|
|
||||||
|
|
||||||
|
def upgrade() -> None:
|
||||||
|
"""Upgrade schema with inventory and availability tables."""
|
||||||
|
op.create_table(
|
||||||
|
"hotel_inventory",
|
||||||
|
sa.Column("id", sa.Integer(), nullable=False),
|
||||||
|
sa.Column("hotel_id", sa.String(length=50), nullable=False),
|
||||||
|
sa.Column("inv_type_code", sa.String(length=8), nullable=False),
|
||||||
|
sa.Column("inv_code", sa.String(length=16), nullable=True),
|
||||||
|
sa.Column("room_name", sa.String(length=200), nullable=True),
|
||||||
|
sa.Column("max_occupancy", sa.Integer(), nullable=True),
|
||||||
|
sa.Column("source", sa.String(length=20), nullable=False),
|
||||||
|
sa.Column("first_seen", sa.DateTime(timezone=True), nullable=False),
|
||||||
|
sa.Column("last_updated", sa.DateTime(timezone=True), nullable=False),
|
||||||
|
sa.ForeignKeyConstraint(["hotel_id"], ["hotels.hotel_id"], ondelete="CASCADE"),
|
||||||
|
sa.PrimaryKeyConstraint("id"),
|
||||||
|
)
|
||||||
|
op.create_index(
|
||||||
|
op.f("ix_hotel_inventory_hotel_id"),
|
||||||
|
"hotel_inventory",
|
||||||
|
["hotel_id"],
|
||||||
|
unique=False,
|
||||||
|
)
|
||||||
|
op.create_index(
|
||||||
|
op.f("ix_hotel_inventory_inv_type_code"),
|
||||||
|
"hotel_inventory",
|
||||||
|
["inv_type_code"],
|
||||||
|
unique=False,
|
||||||
|
)
|
||||||
|
op.create_index(
|
||||||
|
op.f("ix_hotel_inventory_inv_code"),
|
||||||
|
"hotel_inventory",
|
||||||
|
["inv_code"],
|
||||||
|
unique=False,
|
||||||
|
)
|
||||||
|
op.create_index(
|
||||||
|
"uq_hotel_inventory_unique_key",
|
||||||
|
"hotel_inventory",
|
||||||
|
["hotel_id", "inv_type_code", sa.text("COALESCE(inv_code, '')")],
|
||||||
|
unique=True,
|
||||||
|
)
|
||||||
|
|
||||||
|
op.create_table(
|
||||||
|
"room_availability",
|
||||||
|
sa.Column("id", sa.Integer(), nullable=False),
|
||||||
|
sa.Column("inventory_id", sa.Integer(), nullable=False),
|
||||||
|
sa.Column("date", sa.Date(), nullable=False),
|
||||||
|
sa.Column("count_type_2", sa.Integer(), nullable=True),
|
||||||
|
sa.Column("count_type_6", sa.Integer(), nullable=True),
|
||||||
|
sa.Column("count_type_9", sa.Integer(), nullable=True),
|
||||||
|
sa.Column("is_closing_season", sa.Boolean(), nullable=False, server_default=sa.false()),
|
||||||
|
sa.Column("last_updated", sa.DateTime(timezone=True), nullable=False),
|
||||||
|
sa.Column("update_type", sa.String(length=20), nullable=False),
|
||||||
|
sa.ForeignKeyConstraint(["inventory_id"], ["hotel_inventory.id"], ondelete="CASCADE"),
|
||||||
|
sa.PrimaryKeyConstraint("id"),
|
||||||
|
sa.UniqueConstraint("inventory_id", "date", name="uq_room_availability_unique_key"),
|
||||||
|
)
|
||||||
|
op.create_index(
|
||||||
|
op.f("ix_room_availability_inventory_id"),
|
||||||
|
"room_availability",
|
||||||
|
["inventory_id"],
|
||||||
|
unique=False,
|
||||||
|
)
|
||||||
|
op.create_index(
|
||||||
|
op.f("ix_room_availability_date"),
|
||||||
|
"room_availability",
|
||||||
|
["date"],
|
||||||
|
unique=False,
|
||||||
|
)
|
||||||
|
op.create_index(
|
||||||
|
"idx_room_availability_inventory_date",
|
||||||
|
"room_availability",
|
||||||
|
["inventory_id", "date"],
|
||||||
|
unique=False,
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
def downgrade() -> None:
|
||||||
|
"""Downgrade schema by removing availability tables."""
|
||||||
|
op.drop_index("idx_room_availability_inventory_date", table_name="room_availability")
|
||||||
|
op.drop_index(op.f("ix_room_availability_date"), table_name="room_availability")
|
||||||
|
op.drop_index(op.f("ix_room_availability_inventory_id"), table_name="room_availability")
|
||||||
|
op.drop_table("room_availability")
|
||||||
|
|
||||||
|
op.drop_index("uq_hotel_inventory_unique_key", table_name="hotel_inventory")
|
||||||
|
op.drop_index(op.f("ix_hotel_inventory_inv_code"), table_name="hotel_inventory")
|
||||||
|
op.drop_index(op.f("ix_hotel_inventory_inv_type_code"), table_name="hotel_inventory")
|
||||||
|
op.drop_index(op.f("ix_hotel_inventory_hotel_id"), table_name="hotel_inventory")
|
||||||
|
op.drop_table("hotel_inventory")
|
||||||
@@ -392923,3 +392923,74 @@ DETAIL: Key (hotel_id, guest_id)=(39054_001, 28275) is not present in table "co
|
|||||||
[SQL: INSERT INTO conversions (reservation_id, customer_id, hashed_customer_id, hotel_id, guest_id, pms_reservation_id, reservation_number, reservation_date, creation_time, reservation_type, booking_channel, advertising_medium, advertising_partner, advertising_campagne, directly_attributable, guest_matched, created_at, updated_at) VALUES ($1::INTEGER, $2::INTEGER, $3::INTEGER, $4::VARCHAR, $5::VARCHAR, $6::VARCHAR, $7::VARCHAR, $8::DATE, $9::TIMESTAMP WITH TIME ZONE, $10::VARCHAR, $11::VARCHAR, $12::VARCHAR, $13::VARCHAR, $14::VARCHAR, $15::BOOLEAN, $16::BOOLEAN, $17::TIMESTAMP WITH TIME ZONE, $18::TIMESTAMP WITH TIME ZONE) RETURNING conversions.id]
|
[SQL: INSERT INTO conversions (reservation_id, customer_id, hashed_customer_id, hotel_id, guest_id, pms_reservation_id, reservation_number, reservation_date, creation_time, reservation_type, booking_channel, advertising_medium, advertising_partner, advertising_campagne, directly_attributable, guest_matched, created_at, updated_at) VALUES ($1::INTEGER, $2::INTEGER, $3::INTEGER, $4::VARCHAR, $5::VARCHAR, $6::VARCHAR, $7::VARCHAR, $8::DATE, $9::TIMESTAMP WITH TIME ZONE, $10::VARCHAR, $11::VARCHAR, $12::VARCHAR, $13::VARCHAR, $14::VARCHAR, $15::BOOLEAN, $16::BOOLEAN, $17::TIMESTAMP WITH TIME ZONE, $18::TIMESTAMP WITH TIME ZONE) RETURNING conversions.id]
|
||||||
[parameters: (None, None, None, '39054_001', '28275', '833', '532', datetime.date(2014, 5, 24), datetime.datetime(2014, 5, 24, 13, 16, 27), 'reservation', None, None, None, None, False, False, datetime.datetime(2025, 11, 19, 16, 23, 58, 58789), datetime.datetime(2025, 11, 19, 16, 23, 58, 58791))]
|
[parameters: (None, None, None, '39054_001', '28275', '833', '532', datetime.date(2014, 5, 24), datetime.datetime(2014, 5, 24, 13, 16, 27), 'reservation', None, None, None, None, False, False, datetime.datetime(2025, 11, 19, 16, 23, 58, 58789), datetime.datetime(2025, 11, 19, 16, 23, 58, 58791))]
|
||||||
(Background on this error at: https://sqlalche.me/e/20/gkpj)
|
(Background on this error at: https://sqlalche.me/e/20/gkpj)
|
||||||
|
2025-11-25 12:02:21 - root - INFO - Logging to file: config/alpinebits.log
|
||||||
|
2025-11-25 12:02:21 - root - INFO - Logging configured at INFO level
|
||||||
|
2025-11-25 12:02:21 - alpine_bits_python.notification_service - INFO - Registered notification backend: pushover
|
||||||
|
2025-11-25 12:02:21 - alpine_bits_python.notification_manager - INFO - Registered pushover backend with priority 0
|
||||||
|
2025-11-25 12:02:21 - alpine_bits_python.notification_manager - INFO - Notification service configured with backends: ['pushover']
|
||||||
|
2025-11-25 12:02:21 - alpine_bits_python.api - INFO - Application startup initiated (primary_worker=True)
|
||||||
|
2025-11-25 12:02:21 - alpine_bits_python.db - INFO - Configured database schema: alpinebits
|
||||||
|
2025-11-25 12:02:21 - alpine_bits_python.db - INFO - Setting PostgreSQL search_path to: alpinebits,public
|
||||||
|
2025-11-25 12:02:21 - alpine_bits_python.alpinebits_server - INFO - Initializing action instance for AlpineBitsActionName.OTA_HOTEL_NOTIF_REPORT
|
||||||
|
2025-11-25 12:02:21 - alpine_bits_python.alpinebits_server - INFO - Initializing action instance for AlpineBitsActionName.OTA_PING
|
||||||
|
2025-11-25 12:02:21 - alpine_bits_python.alpinebits_server - INFO - Initializing action instance for AlpineBitsActionName.OTA_HOTEL_RES_NOTIF_GUEST_REQUESTS
|
||||||
|
2025-11-25 12:02:21 - alpine_bits_python.alpinebits_server - INFO - Initializing action instance for AlpineBitsActionName.OTA_READ
|
||||||
|
2025-11-25 12:02:21 - alpine_bits_python.webhook_processor - INFO - Registered webhook processor: wix_form
|
||||||
|
2025-11-25 12:02:21 - alpine_bits_python.webhook_processor - INFO - Registered webhook processor: generic
|
||||||
|
2025-11-25 12:02:21 - alpine_bits_python.webhook_processor - INFO - Webhook processors initialized
|
||||||
|
2025-11-25 12:02:21 - alpine_bits_python.api - INFO - Webhook processors initialized
|
||||||
|
2025-11-25 12:02:21 - alpine_bits_python.api - INFO - Hotel 39054_001 has no push_endpoint configured
|
||||||
|
2025-11-25 12:02:21 - alpine_bits_python.api - INFO - Hotel 135 has no push_endpoint configured
|
||||||
|
2025-11-25 12:02:21 - alpine_bits_python.api - INFO - Hotel 39052_001 has no push_endpoint configured
|
||||||
|
2025-11-25 12:02:21 - alpine_bits_python.api - INFO - Hotel 39040_001 has no push_endpoint configured
|
||||||
|
2025-11-25 12:02:21 - alpine_bits_python.api - INFO - Running startup tasks (primary worker)...
|
||||||
|
2025-11-25 12:03:13 - root - INFO - Logging to file: config/alpinebits.log
|
||||||
|
2025-11-25 12:03:13 - root - INFO - Logging configured at INFO level
|
||||||
|
2025-11-25 12:03:13 - alpine_bits_python.notification_service - INFO - Registered notification backend: pushover
|
||||||
|
2025-11-25 12:03:13 - alpine_bits_python.notification_manager - INFO - Registered pushover backend with priority 0
|
||||||
|
2025-11-25 12:03:13 - alpine_bits_python.notification_manager - INFO - Notification service configured with backends: ['pushover']
|
||||||
|
2025-11-25 12:03:13 - alpine_bits_python.api - INFO - Application startup initiated (primary_worker=True)
|
||||||
|
2025-11-25 12:03:13 - alpine_bits_python.db - INFO - Configured database schema: alpinebits
|
||||||
|
2025-11-25 12:03:13 - alpine_bits_python.db - INFO - Setting PostgreSQL search_path to: alpinebits,public
|
||||||
|
2025-11-25 12:03:13 - alpine_bits_python.alpinebits_server - INFO - Initializing action instance for AlpineBitsActionName.OTA_HOTEL_NOTIF_REPORT
|
||||||
|
2025-11-25 12:03:13 - alpine_bits_python.alpinebits_server - INFO - Initializing action instance for AlpineBitsActionName.OTA_PING
|
||||||
|
2025-11-25 12:03:13 - alpine_bits_python.alpinebits_server - INFO - Initializing action instance for AlpineBitsActionName.OTA_HOTEL_RES_NOTIF_GUEST_REQUESTS
|
||||||
|
2025-11-25 12:03:13 - alpine_bits_python.alpinebits_server - INFO - Initializing action instance for AlpineBitsActionName.OTA_READ
|
||||||
|
2025-11-25 12:03:13 - alpine_bits_python.webhook_processor - INFO - Registered webhook processor: wix_form
|
||||||
|
2025-11-25 12:03:13 - alpine_bits_python.webhook_processor - INFO - Registered webhook processor: generic
|
||||||
|
2025-11-25 12:03:13 - alpine_bits_python.webhook_processor - INFO - Webhook processors initialized
|
||||||
|
2025-11-25 12:03:13 - alpine_bits_python.api - INFO - Webhook processors initialized
|
||||||
|
2025-11-25 12:03:13 - alpine_bits_python.api - INFO - Hotel 39054_001 has no push_endpoint configured
|
||||||
|
2025-11-25 12:03:13 - alpine_bits_python.api - INFO - Hotel 135 has no push_endpoint configured
|
||||||
|
2025-11-25 12:03:13 - alpine_bits_python.api - INFO - Hotel 39052_001 has no push_endpoint configured
|
||||||
|
2025-11-25 12:03:13 - alpine_bits_python.api - INFO - Hotel 39040_001 has no push_endpoint configured
|
||||||
|
2025-11-25 12:03:13 - alpine_bits_python.api - INFO - Running startup tasks (primary worker)...
|
||||||
|
2025-11-25 12:03:14 - alpine_bits_python.hotel_service - INFO - Created hotel: 39054_001
|
||||||
|
2025-11-25 12:03:14 - alpine_bits_python.hotel_service - INFO - Created webhook endpoint for hotel 39054_001, type=wix_form, secret=E3wZiShNY47_KqwZUxBe7BJKluorJYj6qHfclrQg1UnyurYojyw_f0Z1KJBrXIuG
|
||||||
|
2025-11-25 12:03:14 - alpine_bits_python.hotel_service - INFO - Created webhook endpoint for hotel 39054_001, type=generic, secret=PmhN4o5MR4VYR9U04kwgp4nRnJ2FMfxm1V2TARIh46Qx49Iy0sndPzeIYXM31KTU
|
||||||
|
2025-11-25 12:03:14 - alpine_bits_python.hotel_service - INFO - Created hotel: 135
|
||||||
|
2025-11-25 12:03:14 - alpine_bits_python.hotel_service - INFO - Created webhook endpoint for hotel 135, type=wix_form, secret=Rb0Dp1mFHZH5vbF66ZsUXoSiIVJdVjskxjlz-PLYXglaJ_DXGo7B7dtw0xbO15_O
|
||||||
|
2025-11-25 12:03:14 - alpine_bits_python.hotel_service - INFO - Created webhook endpoint for hotel 135, type=generic, secret=KCblY6u535uAgQ-nk0DS24FsilkS73hsplVecXy8vFg0GqDFX1lsF2U4JGmPURtf
|
||||||
|
2025-11-25 12:03:14 - alpine_bits_python.hotel_service - INFO - Created hotel: 39052_001
|
||||||
|
2025-11-25 12:03:14 - alpine_bits_python.hotel_service - INFO - Created webhook endpoint for hotel 39052_001, type=wix_form, secret=d-oKOTZ4GqcNIdnR6cMcHtpEUWXNpFgwbc0qRXS_9m1J2vJRYHTjWs3pb8XvF3B_
|
||||||
|
2025-11-25 12:03:14 - alpine_bits_python.hotel_service - INFO - Created webhook endpoint for hotel 39052_001, type=generic, secret=bdO04CoGOHyl7P8zFOB8dNxaxkccNNyXgM_za9pPiFqm4LmUM4KQ0l2qfOTu7gyM
|
||||||
|
2025-11-25 12:03:14 - alpine_bits_python.hotel_service - INFO - Created hotel: 39040_001
|
||||||
|
2025-11-25 12:03:14 - alpine_bits_python.hotel_service - INFO - Created webhook endpoint for hotel 39040_001, type=wix_form, secret=Arj5jksgLxgJcad9OAGIZzWfF1x1g6g965EZKGp-njDsF2oK-jiODYlN4HiO46cz
|
||||||
|
2025-11-25 12:03:14 - alpine_bits_python.hotel_service - INFO - Created webhook endpoint for hotel 39040_001, type=generic, secret=A-_w63IXXmwsztd1pN6wDEtvr_oKO1GRROO5ff9lad0VTAh7WH5mzqUwdi9H8Be-
|
||||||
|
2025-11-25 12:03:14 - alpine_bits_python.hotel_service - INFO - Config sync complete: 4 hotels created, 0 updated, 8 endpoints created
|
||||||
|
2025-11-25 12:03:14 - alpine_bits_python.db_setup - INFO - Config sync: 4 hotels created, 0 updated, 8 endpoints created
|
||||||
|
2025-11-25 12:03:15 - alpine_bits_python.db_setup - INFO - Backfilling advertising account IDs for existing reservations...
|
||||||
|
2025-11-25 12:03:15 - alpine_bits_python.db_setup - INFO - Found 4 hotel(s) with account configurations
|
||||||
|
2025-11-25 12:03:15 - alpine_bits_python.db_setup - INFO - Backfilling usernames for existing acked_requests...
|
||||||
|
2025-11-25 12:03:15 - alpine_bits_python.db_setup - INFO - Found 4 hotel(s) with usernames in config
|
||||||
|
2025-11-25 12:03:15 - alpine_bits_python.api - INFO - Startup tasks completed
|
||||||
|
2025-11-25 12:03:15 - alpine_bits_python.api - INFO - Webhook periodic cleanup task started
|
||||||
|
2025-11-25 12:03:15 - alpine_bits_python.api - INFO - Application startup complete
|
||||||
|
2025-11-25 12:03:35 - alpine_bits_python.api - INFO - Application shutdown initiated
|
||||||
|
2025-11-25 12:03:35 - alpine_bits_python.api - INFO - Webhook cleanup task cancelled
|
||||||
|
2025-11-25 12:03:35 - alpine_bits_python.api - INFO - Webhook cleanup task stopped
|
||||||
|
2025-11-25 12:03:35 - alpine_bits_python.email_service - INFO - Shutting down email service thread pool
|
||||||
|
2025-11-25 12:03:35 - alpine_bits_python.email_service - INFO - Email service thread pool shut down complete
|
||||||
|
2025-11-25 12:03:35 - alpine_bits_python.api - INFO - Email service shut down
|
||||||
|
2025-11-25 12:03:35 - alpine_bits_python.api - INFO - Application shutdown complete
|
||||||
|
2025-11-25 12:03:35 - alpine_bits_python.worker_coordination - INFO - Released primary worker lock (pid=22943)
|
||||||
|
|||||||
131
docs/alpinebits_docs/chapter4/4.1_free_rooms.md
Normal file
131
docs/alpinebits_docs/chapter4/4.1_free_rooms.md
Normal file
@@ -0,0 +1,131 @@
|
|||||||
|
# 4.1 FreeRooms: Room Availability Notifications
|
||||||
|
|
||||||
|
When `action=OTA_HotelInvCountNotif:FreeRooms`, the client sends room availability updates to the server. Servers must support at least one capability: `OTA_HotelInvCountNotif_accept_rooms` (distinct rooms) or `OTA_HotelInvCountNotif_accept_categories` (room categories); they may support both.
|
||||||
|
|
||||||
|
## 4.1.1 Client Request (`OTA_HotelInvCountNotifRQ`)
|
||||||
|
|
||||||
|
- The payload is a single `OTA_HotelInvCountNotifRQ` with exactly one `Inventories` element, so only one hotel is covered per request. `HotelCode` is mandatory; `HotelName` is optional.
|
||||||
|
- Example (outer structure):
|
||||||
|
|
||||||
|
```xml
|
||||||
|
<?xml version="1.0" encoding="UTF-8"?>
|
||||||
|
<OTA_HotelInvCountNotifRQ xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
|
||||||
|
xmlns="http://www.opentravel.org/OTA/2003/05"
|
||||||
|
Version="4"
|
||||||
|
xsi:schemaLocation="http://www.opentravel.org/OTA/2003/05 OTA_HotelInvCountNotifRQ.xsd">
|
||||||
|
<UniqueID Type="16" ID="1" Instance="CompleteSet"/>
|
||||||
|
<Inventories HotelCode="123" HotelName="Frangart Inn">
|
||||||
|
<!-- ... Inventory elements ... -->
|
||||||
|
</Inventories>
|
||||||
|
</OTA_HotelInvCountNotifRQ>
|
||||||
|
```
|
||||||
|
|
||||||
|
- `Inventories` contains one or more `Inventory` elements, each for a distinct period/room or period/category. Example inner portion:
|
||||||
|
|
||||||
|
```xml
|
||||||
|
<Inventory>
|
||||||
|
<StatusApplicationControl Start="2022-08-01" End="2022-08-10" InvTypeCode="DOUBLE" />
|
||||||
|
<InvCounts>
|
||||||
|
<InvCount CountType="2" Count="3" />
|
||||||
|
</InvCounts>
|
||||||
|
</Inventory>
|
||||||
|
<Inventory>
|
||||||
|
<StatusApplicationControl Start="2022-08-11" End="2022-08-20" InvTypeCode="DOUBLE" />
|
||||||
|
</Inventory>
|
||||||
|
<Inventory>
|
||||||
|
<StatusApplicationControl Start="2022-08-21" End="2022-08-30" InvTypeCode="DOUBLE" />
|
||||||
|
<InvCounts>
|
||||||
|
<InvCount CountType="2" Count="1" />
|
||||||
|
</InvCounts>
|
||||||
|
</Inventory>
|
||||||
|
```
|
||||||
|
|
||||||
|
- Missing `InvCode` means the availability refers to a room category (`InvTypeCode`). Using both `InvTypeCode` and `InvCode` targets a specific room. Matching is case-sensitive. Mixing rooms and categories in one request is not allowed.
|
||||||
|
- `InvCounts` may contain up to three `InvCount` entries (all absolute, not deltas):
|
||||||
|
- `CountType=2`: bookable rooms (must be supported).
|
||||||
|
- `CountType=6`: out of order rooms (requires `OTA_HotelInvCountNotif_accept_out_of_order`).
|
||||||
|
- `CountType=9`: available but not bookable rooms (requires `OTA_HotelInvCountNotif_accept_out_of_market`).
|
||||||
|
- Omitted `InvCount` entries imply `Count=0`. If `InvCounts` is omitted, the room/room category is considered fully booked for the period. `Count` is non-negative; for specific rooms it should be `1`. Sum of counts cannot exceed the total rooms; overbooking is not allowed.
|
||||||
|
- Date ranges are inclusive of the start and end nights (checkout is the morning after `End`). Inventory periods must not overlap for the same room or room category; servers may reject overlaps.
|
||||||
|
|
||||||
|
### CompleteSet
|
||||||
|
|
||||||
|
- Purpose: replace all server-held availability for the hotel with the provided data (e.g., first sync or resync after issues).
|
||||||
|
- Server capability required: `OTA_HotelInvCountNotif_accept_complete_set`.
|
||||||
|
- Indicate a complete set with `UniqueID Instance="CompleteSet" Type="16"` (the `ID` value is ignored). `Type="35"` is also accepted and can be used to hint that data was purged by business rules.
|
||||||
|
- A CompleteSet must list every managed room/room category for all periods the client has on record. Even fully booked periods must be present (with `InvCounts` showing zero or omitted entirely).
|
||||||
|
- To fully reset availability, a CompleteSet may contain a single empty `Inventory` element with no attributes (needed for OTA validation).
|
||||||
|
- Do not include periods for which the client has no data source.
|
||||||
|
|
||||||
|
### Deltas
|
||||||
|
|
||||||
|
- If `UniqueID` is missing, the message is a delta: the server updates only what is present and leaves all other stored data untouched.
|
||||||
|
- Server capability required: `OTA_HotelInvCountNotif_accept_deltas`.
|
||||||
|
- If a delta explicitly covers an entire period, it overwrites the prior state for that period.
|
||||||
|
- AlpineBits recommends periodic full CompleteSet syncs when both sides support them. A server should expose at least one of the delta or complete-set capabilities; without CompleteSet support, obsolete data might require manual cleanup.
|
||||||
|
|
||||||
|
### Closing Seasons
|
||||||
|
|
||||||
|
- Indicates periods when the hotel is closed (distinct from fully booked). Requires both parties to expose `OTA_HotelInvCountNotif_accept_closing_seasons`.
|
||||||
|
- Can only appear as the first `Inventory` elements in a CompleteSet.
|
||||||
|
- Structure: one `StatusApplicationControl` with mandatory `Start`, `End`, and `AllInvCode="true"`; no `InvCounts` allowed. Multiple closing periods are allowed if they do not overlap with each other or with availability periods.
|
||||||
|
- Delta messages supersede earlier closed periods; best practice is to avoid such overlaps or follow deltas with a CompleteSet to restate closures explicitly.
|
||||||
|
|
||||||
|
## 4.1.2 Server Response (`OTA_HotelInvCountNotifRS`)
|
||||||
|
|
||||||
|
- Responses return one of the four AlpineBits outcomes (success, advisory, warning, error). The payload is `OTA_HotelInvCountNotifRS`. See section 2.3 for outcome semantics.
|
||||||
|
|
||||||
|
## 4.1.3 Implementation Tips and Best Practice
|
||||||
|
|
||||||
|
- Support for FreeRooms was mandatory in version 2011-11 but is optional now.
|
||||||
|
- Delta updates were added in 2013-04.
|
||||||
|
- The action was completely rewritten in 2020-10.
|
||||||
|
- Forwarders (e.g., channel managers) must not add data beyond what the source provided; do not extend time frames beyond the most future date received.
|
||||||
|
- For CompleteSet requests, servers are encouraged to delete and reinsert all backend availability rather than perform partial updates.
|
||||||
|
- The `End` date is the last night of stay; departure is the morning after `End`.
|
||||||
|
- Length-of-stay and day-of-arrival restrictions were removed from FreeRooms in 2014-04 (they belong in RatePlans).
|
||||||
|
|
||||||
|
## 4.1.4 Tabular Representation of `OTA_HotelInvCountNotifRQ`
|
||||||
|
|
||||||
|
| Level | Element/Attribute | Type | Cardinality |
|
||||||
|
| --- | --- | --- | --- |
|
||||||
|
| OTA_HotelInvCountNotifRQ | element | | 1 |
|
||||||
|
| OTA_HotelInvCountNotifRQ | Version | | 1 |
|
||||||
|
| OTA_HotelInvCountNotifRQ | UniqueID | element | 0-1 |
|
||||||
|
| UniqueID | Type | enum (16 \| 35) | 1 |
|
||||||
|
| UniqueID | ID | | 1 |
|
||||||
|
| UniqueID | Instance | enum (CompleteSet) | 1 |
|
||||||
|
| OTA_HotelInvCountNotifRQ | Inventories | element | 1 |
|
||||||
|
| Inventories | HotelCode | string(1-16) | 1 |
|
||||||
|
| Inventories | HotelName | string(1-128) | 0-1 |
|
||||||
|
| Inventories | Inventory | element | 1..∞ |
|
||||||
|
| Inventory | StatusApplicationControl | element | 0-1 |
|
||||||
|
| StatusApplicationControl | Start | date (\\S+) | 1 |
|
||||||
|
| StatusApplicationControl | End | date (\\S+) | 1 |
|
||||||
|
| StatusApplicationControl | InvTypeCode | string(1-8) | 0-1 |
|
||||||
|
| StatusApplicationControl | InvCode | string(1-16) | 0-1 |
|
||||||
|
| StatusApplicationControl | AllInvCode | boolean (\\S+) | 0-1 |
|
||||||
|
| Inventory | InvCounts | element | 0-1 |
|
||||||
|
| InvCounts | InvCount | element | 1-3 |
|
||||||
|
| InvCount | CountType | enum (2 \| 6 \| 9) | 1 |
|
||||||
|
| InvCount | Count | integer ([0-9]+) | 1 |
|
||||||
|
|
||||||
|
## 4.1.5 Tabular Representation of `OTA_HotelInvCountNotifRS`
|
||||||
|
|
||||||
|
| Level | Element/Attribute | Type | Cardinality |
|
||||||
|
| --- | --- | --- | --- |
|
||||||
|
| OTA_HotelInvCountNotifRS | element | | 1 |
|
||||||
|
| OTA_HotelInvCountNotifRS | Version | | 1 |
|
||||||
|
| OTA_HotelInvCountNotifRS | TimeStamp | | 0-1 |
|
||||||
|
| OTA_HotelInvCountNotifRS | Success | element (choice start) | 1 |
|
||||||
|
| OTA_HotelInvCountNotifRS | Warnings | element (choice start) | 0-1 |
|
||||||
|
| Warnings | Warning | element | 1..∞ |
|
||||||
|
| Warning | Type | integer ([0-9]+) | 1 |
|
||||||
|
| Warning | RecordID | string(1-64) | 0-1 |
|
||||||
|
| Warning | Status | enum (ALPINEBITS_SEND_HANDSHAKE \| ALPINEBITS_SEND_FREEROOMS \| ALPINEBITS_SEND_RATEPLANS \| ALPINEBITS_SEND_INVENTORY) | 0-1 |
|
||||||
|
| OTA_HotelInvCountNotifRS | Errors | element (choice end) | 1 |
|
||||||
|
| Errors | Error | element | 1..∞ |
|
||||||
|
| Error | Type | enum (11 \| 13) | 1 |
|
||||||
|
| Error | Code | integer ([0-9]+) | 0-1 |
|
||||||
|
| Error | Status | enum (ALPINEBITS_SEND_HANDSHAKE \| ALPINEBITS_SEND_FREEROOMS \| ALPINEBITS_SEND_RATEPLANS \| ALPINEBITS_SEND_INVENTORY) | 0-1 |
|
||||||
|
|
||||||
33
docs/alpinebits_docs/chapter4/4_data_exchange_actions.md
Normal file
33
docs/alpinebits_docs/chapter4/4_data_exchange_actions.md
Normal file
@@ -0,0 +1,33 @@
|
|||||||
|
# Chapter 4 - Data Exchange Actions
|
||||||
|
|
||||||
|
These actions define how clients and servers exchange hotel data. For every data exchange request both `action` and `request` parameters are mandatory, and the XML payloads must validate against OTA2015A plus the stricter AlpineBits schema.
|
||||||
|
|
||||||
|
## Action Summary
|
||||||
|
|
||||||
|
| Known as (since) | Usage | Action parameter | Request XML | Response XML |
|
||||||
|
| --- | --- | --- | --- | --- |
|
||||||
|
| FreeRooms (2011-11) | Client sends room availability notifications | `OTA_HotelInvCountNotif:FreeRooms` | `OTA_HotelInvCountNotifRQ` | `OTA_HotelInvCountNotifRS` |
|
||||||
|
| GuestRequests (2012-05) | Client asks server for quote/booking requests | `OTA_Read:GuestRequests` | `OTA_ReadRQ` | `OTA_ResRetrieveRS` |
|
||||||
|
| GuestRequests Push (2018-10) | Client pushes quote/booking requests to server | `OTA_HotelResNotif:GuestRequests` | `OTA_HotelResNotifRQ` | `OTA_HotelResNotifRS` |
|
||||||
|
| GuestRequests Status Update Push (2022-10) | Client sends status updates for quote/booking requests | `OTA_HotelResNotif:GuestRequests_StatusUpdate` | `OTA_HotelResNotifRQ` | `OTA_HotelResNotifRS` |
|
||||||
|
| GuestRequests Acknowledgments (2014-04) | Client acknowledges requests it received | `OTA_NotifReport:GuestRequests` | `OTA_NotifReportRQ` | `OTA_NotifReportRS` |
|
||||||
|
| Inventory/Basic Push (2015-07) | Client sends room category info and room lists | `OTA_HotelDescriptiveContentNotif:Inventory` | `OTA_HotelDescriptiveContentNotifRQ` | `OTA_HotelDescriptiveContentNotifRS` |
|
||||||
|
| Inventory/Basic Pull (2017-10) | Client requests room category info and room lists | `OTA_HotelDescriptiveInfo:Inventory` | `OTA_HotelDescriptiveInfoRQ` | `OTA_HotelDescriptiveInfoRS` |
|
||||||
|
| Inventory/HotelInfo Push (2015-07) | Client sends additional property descriptive content | `OTA_HotelDescriptiveContentNotif:Info` | `OTA_HotelDescriptiveContentNotifRQ` | `OTA_HotelDescriptiveContentNotifRS` |
|
||||||
|
| Inventory/HotelInfo Pull (2017-10) | Client requests additional property descriptive content | `OTA_HotelDescriptiveInfo:Info` | `OTA_HotelDescriptiveInfoRQ` | `OTA_HotelDescriptiveInfoRS` |
|
||||||
|
| RatePlans (2014-04) | Client sends rate plans with prices and booking rules | `OTA_HotelRatePlanNotif:RatePlans` | `OTA_HotelRatePlanNotifRQ` | `OTA_HotelRatePlanNotifRS` |
|
||||||
|
| BaseRates (2017-10) | Client requests rate plan information | `OTA_HotelRatePlan:BaseRates` | `OTA_HotelRatePlanRQ` | `OTA_HotelRatePlanRS` |
|
||||||
|
| Activities (2020-10) | Client requests hotel activity information | `OTA_HotelPostEventNotif:EventReports` | `OTA_HotelPostEventNotifRQ` | `OTA_HotelPostEventNotifRS` |
|
||||||
|
|
||||||
|
## Encoding and Schema Requirements
|
||||||
|
|
||||||
|
- All XML documents must be UTF-8 encoded. Expect arbitrary Unicode (including emojis or non-Latin characters); validate and sanitize before storage to avoid visualization or data corruption issues.
|
||||||
|
- Requests and responses must validate against OTA2015A. The AlpineBits schema provided in the documentation kit is stricter: every document that passes AlpineBits validation also passes OTA2015A, not vice versa.
|
||||||
|
- Sample XML files and the stricter XSD are included in the AlpineBits documentation kit for each protocol version.
|
||||||
|
- Currency codes follow ISO 4217 (EUR shown in samples but any ISO code is allowed). If a server receives an unsupported currency it must reply with a warning outcome; a client should discard responses using unsupported currencies.
|
||||||
|
|
||||||
|
## Copyright and Licensing of Multimedia Content
|
||||||
|
|
||||||
|
- Many messages carry URLs to multimedia objects. Since XML has no place for license data, AlpineBits recommends embedding licensing metadata (e.g., IPTC/EXIF for images) in the files themselves and preserving it in derived works.
|
||||||
|
- Alternatively (or additionally), include HTTP headers `X-AlpineBits-License` and `X-AlpineBits-CopyrightHolder` when serving multimedia content. Receivers should honor and propagate these headers to derived assets.
|
||||||
|
|
||||||
1329910
meta_insights_dump2025_11_27.sql
Normal file
1329910
meta_insights_dump2025_11_27.sql
Normal file
File diff suppressed because one or more lines are too long
@@ -13,6 +13,7 @@ dependencies = [
|
|||||||
"alembic>=1.17.2",
|
"alembic>=1.17.2",
|
||||||
"annotatedyaml>=1.0.0",
|
"annotatedyaml>=1.0.0",
|
||||||
"asyncpg>=0.30.0",
|
"asyncpg>=0.30.0",
|
||||||
|
"bcrypt>=5.0.0",
|
||||||
"dotenv>=0.9.9",
|
"dotenv>=0.9.9",
|
||||||
"fast-langdetect>=1.0.0",
|
"fast-langdetect>=1.0.0",
|
||||||
"fastapi>=0.117.1",
|
"fastapi>=0.117.1",
|
||||||
|
|||||||
@@ -31,6 +31,23 @@ select res.created_at,directly_attributable ,con.reservation_date, res.start_dat
|
|||||||
order by reservation_date;
|
order by reservation_date;
|
||||||
|
|
||||||
|
|
||||||
|
```
|
||||||
|
Um zu schaugn wie viele schon bearbeitet wurden und als Anfragen in ASA drins sein
|
||||||
|
```
|
||||||
|
select res.id, res.created_at, con.created_at as "Con Created at", con.updated_at as "Con Updated at", given_name, surname, guest_first_name, guest_last_name,
|
||||||
|
meta_account_id, google_account_id, con.id
|
||||||
|
|
||||||
|
from alpinebits.reservations as res
|
||||||
|
join alpinebits.customers as cus on res.customer_id = cus.id
|
||||||
|
left join alpinebits.conversions as con on con.reservation_id = res.id
|
||||||
|
left join alpinebits.conversion_guests as g on g.guest_id = con.guest_id
|
||||||
|
|
||||||
|
where hotel_code = '39054_001'
|
||||||
|
|
||||||
|
|
||||||
|
order by res.created_at desc limit 400
|
||||||
|
|
||||||
|
|
||||||
```
|
```
|
||||||
|
|
||||||
```
|
```
|
||||||
|
|||||||
@@ -86,6 +86,10 @@ class AlpineBitsActionName(Enum):
|
|||||||
"action_OTA_HotelRatePlan_BaseRates",
|
"action_OTA_HotelRatePlan_BaseRates",
|
||||||
"OTA_HotelRatePlan:BaseRates",
|
"OTA_HotelRatePlan:BaseRates",
|
||||||
)
|
)
|
||||||
|
OTA_HOTEL_INV_COUNT_NOTIF_FREE_ROOMS = (
|
||||||
|
"action_OTA_HotelInvCountNotif",
|
||||||
|
"OTA_HotelInvCountNotif:FreeRooms",
|
||||||
|
)
|
||||||
|
|
||||||
def __init__(self, capability_name: str, request_name: str):
|
def __init__(self, capability_name: str, request_name: str):
|
||||||
self.capability_name = capability_name
|
self.capability_name = capability_name
|
||||||
@@ -819,3 +823,7 @@ class AlpineBitsServer:
|
|||||||
return False
|
return False
|
||||||
|
|
||||||
return True
|
return True
|
||||||
|
|
||||||
|
|
||||||
|
# Ensure FreeRoomsAction is registered with ServerCapabilities discovery
|
||||||
|
#from .free_rooms_action import FreeRoomsAction # noqa: E402,F401 disable for now
|
||||||
|
|||||||
File diff suppressed because it is too large
Load Diff
@@ -1,5 +1,6 @@
|
|||||||
import os
|
import os
|
||||||
from pathlib import Path
|
from pathlib import Path
|
||||||
|
from typing import Any
|
||||||
|
|
||||||
from annotatedyaml.loader import Secrets
|
from annotatedyaml.loader import Secrets
|
||||||
from annotatedyaml.loader import load_yaml as load_annotated_yaml
|
from annotatedyaml.loader import load_yaml as load_annotated_yaml
|
||||||
@@ -334,3 +335,42 @@ def load_config():
|
|||||||
def get_username_for_hotel(config: dict, hotel_code: str) -> str:
|
def get_username_for_hotel(config: dict, hotel_code: str) -> str:
|
||||||
"""Get the username associated with a hotel_code from config."""
|
"""Get the username associated with a hotel_code from config."""
|
||||||
return next(h.get("username") for h in config.get("alpine_bits_auth", []) if h.get("hotel_id") == hotel_code)
|
return next(h.get("username") for h in config.get("alpine_bits_auth", []) if h.get("hotel_id") == hotel_code)
|
||||||
|
|
||||||
|
|
||||||
|
def get_advertising_account_ids(
|
||||||
|
config: dict[str, Any], hotel_code: str, fbclid: str | None, gclid: str | None
|
||||||
|
) -> tuple[str | None, str | None]:
|
||||||
|
"""Get advertising account IDs based on hotel config and click IDs.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
config: Application configuration dict
|
||||||
|
hotel_code: Hotel identifier to look up in config
|
||||||
|
fbclid: Facebook click ID (if present, meta_account_id will be returned)
|
||||||
|
gclid: Google click ID (if present, google_account_id will be returned)
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
Tuple of (meta_account_id, google_account_id) based on conditional logic:
|
||||||
|
- meta_account_id is set only if fbclid is present AND hotel has
|
||||||
|
meta_account configured
|
||||||
|
- google_account_id is set only if gclid is present AND hotel has
|
||||||
|
google_account configured
|
||||||
|
|
||||||
|
"""
|
||||||
|
meta_account_id = None
|
||||||
|
google_account_id = None
|
||||||
|
|
||||||
|
# Look up hotel in config
|
||||||
|
alpine_bits_auth = config.get("alpine_bits_auth", [])
|
||||||
|
for hotel in alpine_bits_auth:
|
||||||
|
if hotel.get(CONF_HOTEL_ID) == hotel_code:
|
||||||
|
# Conditionally set meta_account_id if fbclid is present
|
||||||
|
if fbclid:
|
||||||
|
meta_account_id = hotel.get(CONF_META_ACCOUNT)
|
||||||
|
|
||||||
|
# Conditionally set google_account_id if gclid is present
|
||||||
|
if gclid:
|
||||||
|
google_account_id = hotel.get(CONF_GOOGLE_ACCOUNT)
|
||||||
|
|
||||||
|
break
|
||||||
|
|
||||||
|
return meta_account_id, google_account_id
|
||||||
|
|||||||
@@ -1,7 +1,16 @@
|
|||||||
from enum import IntEnum
|
from enum import IntEnum, StrEnum
|
||||||
from typing import Final
|
from typing import Final
|
||||||
|
|
||||||
|
|
||||||
|
class WebhookStatus(StrEnum):
|
||||||
|
"""Allowed webhook processing statuses for AlpineBits."""
|
||||||
|
|
||||||
|
PROCESSING = "processing"
|
||||||
|
COMPLETED = "completed"
|
||||||
|
FAILED = "failed"
|
||||||
|
PENDING = "pending"
|
||||||
|
|
||||||
|
|
||||||
class HttpStatusCode(IntEnum):
|
class HttpStatusCode(IntEnum):
|
||||||
"""Allowed HTTP status codes for AlpineBits responses."""
|
"""Allowed HTTP status codes for AlpineBits responses."""
|
||||||
|
|
||||||
|
|||||||
@@ -731,17 +731,10 @@ class ConversionService:
|
|||||||
# Flush to ensure conversion has an ID before creating room reservations
|
# Flush to ensure conversion has an ID before creating room reservations
|
||||||
await session.flush()
|
await session.flush()
|
||||||
|
|
||||||
# Batch-load existing room reservations to avoid N+1 queries
|
# Fetch ALL existing rooms for this conversion (not just the ones in current XML)
|
||||||
room_numbers = [
|
|
||||||
rm.get("roomNumber") for rm in room_reservations.findall("roomReservation")
|
|
||||||
]
|
|
||||||
pms_hotel_reservation_ids = [
|
|
||||||
f"{pms_reservation_id}_{room_num}" for room_num in room_numbers
|
|
||||||
]
|
|
||||||
|
|
||||||
existing_rooms_result = await session.execute(
|
existing_rooms_result = await session.execute(
|
||||||
select(ConversionRoom).where(
|
select(ConversionRoom).where(
|
||||||
ConversionRoom.pms_hotel_reservation_id.in_(pms_hotel_reservation_ids)
|
ConversionRoom.conversion_id == conversion.id
|
||||||
)
|
)
|
||||||
)
|
)
|
||||||
existing_rooms = {
|
existing_rooms = {
|
||||||
@@ -749,6 +742,9 @@ class ConversionService:
|
|||||||
for room in existing_rooms_result.scalars().all()
|
for room in existing_rooms_result.scalars().all()
|
||||||
}
|
}
|
||||||
|
|
||||||
|
# Track which room IDs are present in the current XML
|
||||||
|
current_pms_hotel_reservation_ids = set()
|
||||||
|
|
||||||
# Process room reservations
|
# Process room reservations
|
||||||
for room_reservation in room_reservations.findall("roomReservation"):
|
for room_reservation in room_reservations.findall("roomReservation"):
|
||||||
# Extract room reservation details
|
# Extract room reservation details
|
||||||
@@ -786,6 +782,9 @@ class ConversionService:
|
|||||||
# This allows updating the same room reservation if it appears again
|
# This allows updating the same room reservation if it appears again
|
||||||
pms_hotel_reservation_id = f"{pms_reservation_id}_{room_number}"
|
pms_hotel_reservation_id = f"{pms_reservation_id}_{room_number}"
|
||||||
|
|
||||||
|
# Track this room as present in current XML
|
||||||
|
current_pms_hotel_reservation_ids.add(pms_hotel_reservation_id)
|
||||||
|
|
||||||
# Process daily sales and extract total revenue
|
# Process daily sales and extract total revenue
|
||||||
daily_sales_elem = room_reservation.find("dailySales")
|
daily_sales_elem = room_reservation.find("dailySales")
|
||||||
daily_sales_list = []
|
daily_sales_list = []
|
||||||
@@ -880,6 +879,24 @@ class ConversionService:
|
|||||||
num_adults,
|
num_adults,
|
||||||
)
|
)
|
||||||
|
|
||||||
|
# Delete room entries that are no longer present in the current XML
|
||||||
|
# This handles cases where a reservation is updated and room numbers change
|
||||||
|
rooms_to_delete = [
|
||||||
|
room
|
||||||
|
for pms_id, room in existing_rooms.items()
|
||||||
|
if pms_id not in current_pms_hotel_reservation_ids
|
||||||
|
]
|
||||||
|
|
||||||
|
if rooms_to_delete:
|
||||||
|
for room in rooms_to_delete:
|
||||||
|
await session.delete(room)
|
||||||
|
_LOGGER.debug(
|
||||||
|
"Deleted room reservation %s (pms_id=%s, room=%s) - no longer in current XML",
|
||||||
|
room.id,
|
||||||
|
room.pms_hotel_reservation_id,
|
||||||
|
room.room_number,
|
||||||
|
)
|
||||||
|
|
||||||
return stats
|
return stats
|
||||||
|
|
||||||
|
|
||||||
|
|||||||
@@ -4,6 +4,8 @@ import os
|
|||||||
from collections.abc import AsyncGenerator, Callable
|
from collections.abc import AsyncGenerator, Callable
|
||||||
from typing import TypeVar
|
from typing import TypeVar
|
||||||
|
|
||||||
|
from .const import WebhookStatus
|
||||||
|
|
||||||
from sqlalchemy import (
|
from sqlalchemy import (
|
||||||
JSON,
|
JSON,
|
||||||
Boolean,
|
Boolean,
|
||||||
@@ -13,8 +15,11 @@ from sqlalchemy import (
|
|||||||
Double,
|
Double,
|
||||||
ForeignKey,
|
ForeignKey,
|
||||||
ForeignKeyConstraint,
|
ForeignKeyConstraint,
|
||||||
|
Index,
|
||||||
Integer,
|
Integer,
|
||||||
String,
|
String,
|
||||||
|
UniqueConstraint,
|
||||||
|
func,
|
||||||
)
|
)
|
||||||
from sqlalchemy.exc import DBAPIError
|
from sqlalchemy.exc import DBAPIError
|
||||||
from sqlalchemy.ext.asyncio import (
|
from sqlalchemy.ext.asyncio import (
|
||||||
@@ -674,3 +679,176 @@ class ConversionRoom(Base):
|
|||||||
|
|
||||||
# Relationships
|
# Relationships
|
||||||
conversion = relationship("Conversion", back_populates="conversion_rooms")
|
conversion = relationship("Conversion", back_populates="conversion_rooms")
|
||||||
|
|
||||||
|
|
||||||
|
class HotelInventory(Base):
|
||||||
|
"""Room and category definitions synchronized via AlpineBits."""
|
||||||
|
|
||||||
|
__tablename__ = "hotel_inventory"
|
||||||
|
|
||||||
|
id = Column(Integer, primary_key=True)
|
||||||
|
hotel_id = Column(
|
||||||
|
String(50), ForeignKey("hotels.hotel_id", ondelete="CASCADE"), nullable=False, index=True
|
||||||
|
)
|
||||||
|
inv_type_code = Column(String(8), nullable=False, index=True)
|
||||||
|
inv_code = Column(String(16), nullable=True, index=True)
|
||||||
|
room_name = Column(String(200), nullable=True)
|
||||||
|
max_occupancy = Column(Integer, nullable=True)
|
||||||
|
source = Column(String(20), nullable=False)
|
||||||
|
first_seen = Column(DateTime(timezone=True), nullable=False)
|
||||||
|
last_updated = Column(DateTime(timezone=True), nullable=False)
|
||||||
|
|
||||||
|
hotel = relationship("Hotel", back_populates="inventory_items")
|
||||||
|
availability = relationship(
|
||||||
|
"RoomAvailability",
|
||||||
|
back_populates="inventory_item",
|
||||||
|
cascade="all, delete-orphan",
|
||||||
|
passive_deletes=True,
|
||||||
|
)
|
||||||
|
|
||||||
|
__table_args__ = (
|
||||||
|
Index(
|
||||||
|
"uq_hotel_inventory_unique_key",
|
||||||
|
"hotel_id",
|
||||||
|
"inv_type_code",
|
||||||
|
func.coalesce(inv_code, ""),
|
||||||
|
unique=True,
|
||||||
|
),
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
class RoomAvailability(Base):
|
||||||
|
"""Daily availability counts for inventory items."""
|
||||||
|
|
||||||
|
__tablename__ = "room_availability"
|
||||||
|
|
||||||
|
id = Column(Integer, primary_key=True)
|
||||||
|
inventory_id = Column(
|
||||||
|
Integer, ForeignKey("hotel_inventory.id", ondelete="CASCADE"), nullable=False, index=True
|
||||||
|
)
|
||||||
|
date = Column(Date, nullable=False, index=True)
|
||||||
|
count_type_2 = Column(Integer, nullable=True)
|
||||||
|
count_type_6 = Column(Integer, nullable=True)
|
||||||
|
count_type_9 = Column(Integer, nullable=True)
|
||||||
|
is_closing_season = Column(Boolean, nullable=False, default=False)
|
||||||
|
last_updated = Column(DateTime(timezone=True), nullable=False)
|
||||||
|
update_type = Column(String(20), nullable=False)
|
||||||
|
|
||||||
|
inventory_item = relationship("HotelInventory", back_populates="availability")
|
||||||
|
|
||||||
|
__table_args__ = (
|
||||||
|
UniqueConstraint("inventory_id", "date", name="uq_room_availability_unique_key"),
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
class Hotel(Base):
|
||||||
|
"""Hotel configuration (migrated from alpine_bits_auth in config.yaml)."""
|
||||||
|
|
||||||
|
__tablename__ = "hotels"
|
||||||
|
|
||||||
|
id = Column(Integer, primary_key=True)
|
||||||
|
|
||||||
|
# Core identification
|
||||||
|
hotel_id = Column(String(50), unique=True, nullable=False, index=True)
|
||||||
|
hotel_name = Column(String(200), nullable=False)
|
||||||
|
|
||||||
|
# AlpineBits authentication
|
||||||
|
username = Column(String(100), unique=True, nullable=False, index=True)
|
||||||
|
password_hash = Column(String(200), nullable=False) # bcrypt
|
||||||
|
|
||||||
|
# Advertising accounts
|
||||||
|
meta_account_id = Column(String(50), nullable=True)
|
||||||
|
google_account_id = Column(String(50), nullable=True)
|
||||||
|
|
||||||
|
# Push endpoint (optional)
|
||||||
|
push_endpoint_url = Column(String(500), nullable=True)
|
||||||
|
push_endpoint_token = Column(String(200), nullable=True)
|
||||||
|
push_endpoint_username = Column(String(100), nullable=True)
|
||||||
|
|
||||||
|
# Metadata
|
||||||
|
created_at = Column(DateTime(timezone=True), nullable=False)
|
||||||
|
updated_at = Column(DateTime(timezone=True), nullable=False)
|
||||||
|
is_active = Column(Boolean, default=True, nullable=False, index=True)
|
||||||
|
|
||||||
|
# Relationships
|
||||||
|
webhook_endpoints = relationship("WebhookEndpoint", back_populates="hotel")
|
||||||
|
inventory_items = relationship(
|
||||||
|
"HotelInventory", back_populates="hotel", cascade="all, delete-orphan"
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
class WebhookEndpoint(Base):
|
||||||
|
"""Webhook configurations per hotel (supports multiple webhook types per hotel)."""
|
||||||
|
|
||||||
|
__tablename__ = "webhook_endpoints"
|
||||||
|
|
||||||
|
id = Column(Integer, primary_key=True)
|
||||||
|
|
||||||
|
# Hotel association
|
||||||
|
hotel_id = Column(String(50), ForeignKey("hotels.hotel_id"), nullable=False, index=True)
|
||||||
|
|
||||||
|
# Webhook configuration
|
||||||
|
webhook_secret = Column(String(64), unique=True, nullable=False, index=True)
|
||||||
|
webhook_type = Column(String(50), nullable=False) # 'wix_form', 'generic', etc.
|
||||||
|
|
||||||
|
# Metadata
|
||||||
|
description = Column(String(200), nullable=True) # Human-readable label
|
||||||
|
is_enabled = Column(Boolean, default=True, nullable=False)
|
||||||
|
created_at = Column(DateTime(timezone=True), nullable=False)
|
||||||
|
|
||||||
|
# Relationships
|
||||||
|
hotel = relationship("Hotel", back_populates="webhook_endpoints")
|
||||||
|
webhook_requests = relationship("WebhookRequest", back_populates="webhook_endpoint")
|
||||||
|
|
||||||
|
__table_args__ = (
|
||||||
|
Index('idx_webhook_endpoint_hotel_type', 'hotel_id', 'webhook_type'),
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
class WebhookRequest(Base):
|
||||||
|
"""Tracks incoming webhooks for deduplication and retry handling."""
|
||||||
|
|
||||||
|
__tablename__ = "webhook_requests"
|
||||||
|
|
||||||
|
id = Column(Integer, primary_key=True)
|
||||||
|
|
||||||
|
# Request identification
|
||||||
|
payload_hash = Column(String(64), unique=True, nullable=False, index=True) # SHA256
|
||||||
|
webhook_endpoint_id = Column(Integer, ForeignKey("webhook_endpoints.id"), nullable=True, index=True)
|
||||||
|
hotel_id = Column(String(50), ForeignKey("hotels.hotel_id"), nullable=True, index=True)
|
||||||
|
|
||||||
|
# Processing tracking
|
||||||
|
status = Column(String(20), nullable=False, default=WebhookStatus.PENDING.value, index=True)
|
||||||
|
# Status values: 'pending', 'processing', 'completed', 'failed' set by Enum WebhookStatus
|
||||||
|
|
||||||
|
processing_started_at = Column(DateTime(timezone=True), nullable=True)
|
||||||
|
processing_completed_at = Column(DateTime(timezone=True), nullable=True)
|
||||||
|
|
||||||
|
# Retry handling
|
||||||
|
retry_count = Column(Integer, default=0)
|
||||||
|
last_error = Column(String(2000), nullable=True)
|
||||||
|
|
||||||
|
# Payload storage
|
||||||
|
payload_json = Column(JSON, nullable=True) # NULL after purge, kept for retries
|
||||||
|
purged_at = Column(DateTime(timezone=True), nullable=True) # When JSON was purged
|
||||||
|
|
||||||
|
# Metadata
|
||||||
|
created_at = Column(DateTime(timezone=True), nullable=False, index=True)
|
||||||
|
source_ip = Column(String(45), nullable=True)
|
||||||
|
user_agent = Column(String(500), nullable=True)
|
||||||
|
|
||||||
|
# Result tracking
|
||||||
|
created_customer_id = Column(Integer, ForeignKey("customers.id"), nullable=True)
|
||||||
|
created_reservation_id = Column(Integer, ForeignKey("reservations.id"), nullable=True)
|
||||||
|
|
||||||
|
# Relationships
|
||||||
|
webhook_endpoint = relationship("WebhookEndpoint", back_populates="webhook_requests")
|
||||||
|
hotel = relationship("Hotel")
|
||||||
|
customer = relationship("Customer")
|
||||||
|
reservation = relationship("Reservation")
|
||||||
|
|
||||||
|
__table_args__ = (
|
||||||
|
Index('idx_webhook_status_created', 'status', 'created_at'),
|
||||||
|
Index('idx_webhook_hotel_created', 'hotel_id', 'created_at'),
|
||||||
|
Index('idx_webhook_purge_candidate', 'status', 'purged_at', 'created_at'),
|
||||||
|
)
|
||||||
|
|||||||
@@ -7,15 +7,18 @@ before the application starts accepting requests. It includes:
|
|||||||
"""
|
"""
|
||||||
|
|
||||||
import asyncio
|
import asyncio
|
||||||
|
from datetime import UTC, datetime
|
||||||
from typing import Any
|
from typing import Any
|
||||||
|
|
||||||
from sqlalchemy import text
|
from sqlalchemy import select, text
|
||||||
from sqlalchemy.ext.asyncio import AsyncEngine, async_sessionmaker
|
from sqlalchemy.ext.asyncio import AsyncEngine, async_sessionmaker
|
||||||
|
from sqlalchemy.orm import selectinload
|
||||||
|
|
||||||
from .const import CONF_GOOGLE_ACCOUNT, CONF_HOTEL_ID, CONF_META_ACCOUNT
|
from .const import CONF_GOOGLE_ACCOUNT, CONF_HOTEL_ID, CONF_META_ACCOUNT, WebhookStatus
|
||||||
from .customer_service import CustomerService
|
from .customer_service import CustomerService
|
||||||
from .db import create_database_engine
|
from .db import WebhookEndpoint, WebhookRequest, create_database_engine
|
||||||
from .logging_config import get_logger
|
from .logging_config import get_logger
|
||||||
|
from .webhook_processor import webhook_registry
|
||||||
|
|
||||||
_LOGGER = get_logger(__name__)
|
_LOGGER = get_logger(__name__)
|
||||||
|
|
||||||
@@ -236,6 +239,156 @@ async def backfill_acked_requests_username(
|
|||||||
)
|
)
|
||||||
|
|
||||||
|
|
||||||
|
async def reprocess_stuck_webhooks(
|
||||||
|
sessionmaker: async_sessionmaker,
|
||||||
|
config: dict[str, Any] | None = None,
|
||||||
|
) -> None:
|
||||||
|
"""Reprocess webhooks that were stuck in 'processing' state.
|
||||||
|
|
||||||
|
Finds webhooks with status='processing' and reprocesses them.
|
||||||
|
These are webhooks that were not fully processed in the previous run,
|
||||||
|
likely due to a crash or unexpected shutdown.
|
||||||
|
|
||||||
|
This function is designed to NEVER block application startup.
|
||||||
|
All errors are caught and logged, but the app will start regardless.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
sessionmaker: SQLAlchemy async sessionmaker
|
||||||
|
config: Application configuration dictionary
|
||||||
|
"""
|
||||||
|
try:
|
||||||
|
_LOGGER.info("Checking for stuck webhooks to reprocess...")
|
||||||
|
|
||||||
|
async with sessionmaker() as session:
|
||||||
|
# Find all webhooks stuck in 'processing' state
|
||||||
|
result = await session.execute(
|
||||||
|
select(WebhookRequest)
|
||||||
|
.where(WebhookRequest.status == WebhookStatus.PROCESSING)
|
||||||
|
.options(
|
||||||
|
selectinload(WebhookRequest.webhook_endpoint).selectinload(
|
||||||
|
WebhookEndpoint.hotel
|
||||||
|
)
|
||||||
|
)
|
||||||
|
)
|
||||||
|
stuck_webhooks: list[WebhookRequest] = result.scalars().all()
|
||||||
|
|
||||||
|
if not stuck_webhooks:
|
||||||
|
_LOGGER.info("No stuck webhooks found")
|
||||||
|
return
|
||||||
|
|
||||||
|
_LOGGER.info("Found %d stuck webhooks to reprocess", len(stuck_webhooks))
|
||||||
|
|
||||||
|
reprocessed_count = 0
|
||||||
|
failed_count = 0
|
||||||
|
|
||||||
|
for webhook_request in stuck_webhooks:
|
||||||
|
webhook_id = webhook_request.id
|
||||||
|
webhook_endpoint = webhook_request.webhook_endpoint
|
||||||
|
|
||||||
|
if not webhook_endpoint:
|
||||||
|
_LOGGER.error(
|
||||||
|
"Webhook request %d has no webhook_endpoint, skipping", webhook_id
|
||||||
|
)
|
||||||
|
webhook_request.status = WebhookStatus.FAILED
|
||||||
|
webhook_request.last_error = (
|
||||||
|
"No webhook endpoint found during startup reprocessing"
|
||||||
|
)
|
||||||
|
webhook_request.processing_completed_at = datetime.now(UTC)
|
||||||
|
failed_count += 1
|
||||||
|
continue
|
||||||
|
|
||||||
|
if not webhook_request.payload_json:
|
||||||
|
_LOGGER.error(
|
||||||
|
"Webhook request %d has no payload (purged?), marking as failed",
|
||||||
|
webhook_id,
|
||||||
|
)
|
||||||
|
webhook_request.status = WebhookStatus.FAILED
|
||||||
|
webhook_request.last_error = (
|
||||||
|
"No payload available for reprocessing (purged)"
|
||||||
|
)
|
||||||
|
webhook_request.processing_completed_at = datetime.now(UTC)
|
||||||
|
failed_count += 1
|
||||||
|
continue
|
||||||
|
|
||||||
|
try:
|
||||||
|
_LOGGER.info(
|
||||||
|
"Reprocessing webhook %d (hotel=%s, type=%s)",
|
||||||
|
webhook_id,
|
||||||
|
webhook_endpoint.hotel_id,
|
||||||
|
webhook_endpoint.webhook_type,
|
||||||
|
)
|
||||||
|
|
||||||
|
# Get processor for webhook_type
|
||||||
|
processor = webhook_registry.get_processor(
|
||||||
|
webhook_endpoint.webhook_type
|
||||||
|
)
|
||||||
|
if not processor:
|
||||||
|
raise ValueError(
|
||||||
|
f"No processor for type: {webhook_endpoint.webhook_type}"
|
||||||
|
)
|
||||||
|
|
||||||
|
# Reprocess webhook with simplified interface
|
||||||
|
result = await processor.process(
|
||||||
|
webhook_request=webhook_request,
|
||||||
|
db_session=session,
|
||||||
|
config=config,
|
||||||
|
)
|
||||||
|
|
||||||
|
# Check result status
|
||||||
|
result_status = result.get("status") if isinstance(result, dict) else "success"
|
||||||
|
|
||||||
|
if result_status == "duplicate":
|
||||||
|
# Duplicate is not an error - mark as completed and continue
|
||||||
|
webhook_request.status = WebhookStatus.COMPLETED
|
||||||
|
webhook_request.processing_completed_at = datetime.now(UTC)
|
||||||
|
reprocessed_count += 1
|
||||||
|
_LOGGER.info(
|
||||||
|
"Webhook %d was a duplicate (already processed), marked as completed",
|
||||||
|
webhook_id
|
||||||
|
)
|
||||||
|
elif result_status in ("success", "completed"):
|
||||||
|
# Update status to completed
|
||||||
|
webhook_request.status = WebhookStatus.COMPLETED
|
||||||
|
webhook_request.processing_completed_at = datetime.now(UTC)
|
||||||
|
reprocessed_count += 1
|
||||||
|
_LOGGER.info("Successfully reprocessed webhook %d", webhook_id)
|
||||||
|
else:
|
||||||
|
# Unexpected status - treat as failure
|
||||||
|
_LOGGER.warning(
|
||||||
|
"Webhook %d returned unexpected status: %s",
|
||||||
|
webhook_id,
|
||||||
|
result_status
|
||||||
|
)
|
||||||
|
webhook_request.status = WebhookStatus.FAILED
|
||||||
|
webhook_request.last_error = f"Unexpected status: {result_status}"
|
||||||
|
webhook_request.processing_completed_at = datetime.now(UTC)
|
||||||
|
failed_count += 1
|
||||||
|
|
||||||
|
except Exception as e:
|
||||||
|
_LOGGER.exception("Failed to reprocess webhook %d: %s", webhook_id, e)
|
||||||
|
webhook_request.status = WebhookStatus.FAILED
|
||||||
|
webhook_request.last_error = (
|
||||||
|
f"Reprocessing failed during startup: {str(e)[:1950]}"
|
||||||
|
)
|
||||||
|
webhook_request.processing_completed_at = datetime.now(UTC)
|
||||||
|
failed_count += 1
|
||||||
|
|
||||||
|
# Commit all changes
|
||||||
|
await session.commit()
|
||||||
|
|
||||||
|
_LOGGER.info(
|
||||||
|
"Webhook reprocessing complete: %d successful, %d failed",
|
||||||
|
reprocessed_count,
|
||||||
|
failed_count,
|
||||||
|
)
|
||||||
|
except Exception as e:
|
||||||
|
# CRITICAL: Never let reprocessing block application startup
|
||||||
|
_LOGGER.exception(
|
||||||
|
"CRITICAL ERROR during webhook reprocessing, but allowing app to start: %s",
|
||||||
|
e
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
async def run_startup_tasks(
|
async def run_startup_tasks(
|
||||||
sessionmaker: async_sessionmaker,
|
sessionmaker: async_sessionmaker,
|
||||||
config: dict[str, Any] | None = None,
|
config: dict[str, Any] | None = None,
|
||||||
@@ -251,6 +404,18 @@ async def run_startup_tasks(
|
|||||||
config: Application configuration dictionary
|
config: Application configuration dictionary
|
||||||
engine: SQLAlchemy async engine (optional, for backfill tasks)
|
engine: SQLAlchemy async engine (optional, for backfill tasks)
|
||||||
"""
|
"""
|
||||||
|
# Sync config to database (hotels and webhook endpoints)
|
||||||
|
if config:
|
||||||
|
from .hotel_service import sync_config_to_database
|
||||||
|
async with sessionmaker() as session:
|
||||||
|
stats = await sync_config_to_database(session, config)
|
||||||
|
_LOGGER.info(
|
||||||
|
"Config sync: %d hotels created, %d updated, %d endpoints created",
|
||||||
|
stats["hotels_created"],
|
||||||
|
stats["hotels_updated"],
|
||||||
|
stats["endpoints_created"]
|
||||||
|
)
|
||||||
|
|
||||||
# Hash any existing customers that don't have hashed data
|
# Hash any existing customers that don't have hashed data
|
||||||
async with sessionmaker() as session:
|
async with sessionmaker() as session:
|
||||||
customer_service = CustomerService(session)
|
customer_service = CustomerService(session)
|
||||||
@@ -272,3 +437,6 @@ async def run_startup_tasks(
|
|||||||
"No engine provided to run_startup_tasks, "
|
"No engine provided to run_startup_tasks, "
|
||||||
"skipping config-based backfill tasks"
|
"skipping config-based backfill tasks"
|
||||||
)
|
)
|
||||||
|
|
||||||
|
# Reprocess stuck webhooks (those stuck in 'processing' state)
|
||||||
|
await reprocess_stuck_webhooks(sessionmaker, config)
|
||||||
|
|||||||
600
src/alpine_bits_python/free_rooms_action.py
Normal file
600
src/alpine_bits_python/free_rooms_action.py
Normal file
@@ -0,0 +1,600 @@
|
|||||||
|
"""Action handler for OTA_HotelInvCountNotif:FreeRooms."""
|
||||||
|
|
||||||
|
from __future__ import annotations
|
||||||
|
|
||||||
|
from dataclasses import dataclass
|
||||||
|
from datetime import UTC, date, datetime, timedelta
|
||||||
|
from typing import Any
|
||||||
|
|
||||||
|
from sqlalchemy import delete, select
|
||||||
|
from sqlalchemy.dialects.postgresql import insert as pg_insert
|
||||||
|
from sqlalchemy.dialects.sqlite import insert as sqlite_insert
|
||||||
|
from sqlalchemy.ext.asyncio import AsyncSession
|
||||||
|
from xsdata.formats.dataclass.serializers.config import SerializerConfig
|
||||||
|
from xsdata_pydantic.bindings import XmlParser, XmlSerializer
|
||||||
|
|
||||||
|
from .alpinebits_server import (
|
||||||
|
AlpineBitsAction,
|
||||||
|
AlpineBitsActionName,
|
||||||
|
AlpineBitsClientInfo,
|
||||||
|
AlpineBitsResponse,
|
||||||
|
Version,
|
||||||
|
validate_hotel_authentication,
|
||||||
|
)
|
||||||
|
from .const import HttpStatusCode
|
||||||
|
from .db import Hotel, HotelInventory, RoomAvailability
|
||||||
|
from .generated import (
|
||||||
|
ErrorType,
|
||||||
|
InvCountCountType,
|
||||||
|
OtaHotelInvCountNotifRq,
|
||||||
|
OtaHotelInvCountNotifRs,
|
||||||
|
UniqueIdInstance,
|
||||||
|
)
|
||||||
|
from .logging_config import get_logger
|
||||||
|
|
||||||
|
_LOGGER = get_logger(__name__)
|
||||||
|
|
||||||
|
SUPPORTED_CAPABILITIES = [
|
||||||
|
"OTA_HotelInvCountNotif_accept_rooms",
|
||||||
|
"OTA_HotelInvCountNotif_accept_categories",
|
||||||
|
"OTA_HotelInvCountNotif_accept_deltas",
|
||||||
|
"OTA_HotelInvCountNotif_accept_complete_set",
|
||||||
|
"OTA_HotelInvCountNotif_accept_out_of_order",
|
||||||
|
"OTA_HotelInvCountNotif_accept_out_of_market",
|
||||||
|
"OTA_HotelInvCountNotif_accept_closing_seasons",
|
||||||
|
]
|
||||||
|
|
||||||
|
CLOSING_SEASON_TYPE = "__CLOSE" # <= 8 chars per spec
|
||||||
|
SOURCE_FREEROOMS = "FreeRooms"
|
||||||
|
|
||||||
|
COUNT_TYPE_MAP = {
|
||||||
|
InvCountCountType.VALUE_2: "count_type_2",
|
||||||
|
InvCountCountType.VALUE_6: "count_type_6",
|
||||||
|
InvCountCountType.VALUE_9: "count_type_9",
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
@dataclass
|
||||||
|
class FreeRoomsProcessingError(Exception):
|
||||||
|
"""Custom exception that carries HTTP and OTA error metadata."""
|
||||||
|
|
||||||
|
message: str
|
||||||
|
status_code: HttpStatusCode = HttpStatusCode.BAD_REQUEST
|
||||||
|
error_type: ErrorType = ErrorType.VALUE_13
|
||||||
|
code: str = "450"
|
||||||
|
|
||||||
|
def __str__(self) -> str:
|
||||||
|
return self.message
|
||||||
|
|
||||||
|
|
||||||
|
class FreeRoomsAction(AlpineBitsAction):
|
||||||
|
"""Handler for OTA_HotelInvCountNotif:FreeRooms requests."""
|
||||||
|
|
||||||
|
def __init__(self, config: dict | None = None):
|
||||||
|
self.name = AlpineBitsActionName.OTA_HOTEL_INV_COUNT_NOTIF_FREE_ROOMS
|
||||||
|
self.version = [Version.V2024_10, Version.V2022_10]
|
||||||
|
self.config = config or {}
|
||||||
|
self.supports = SUPPORTED_CAPABILITIES
|
||||||
|
|
||||||
|
self._parser = XmlParser()
|
||||||
|
self._serializer = XmlSerializer(
|
||||||
|
config=SerializerConfig(
|
||||||
|
pretty_print=True,
|
||||||
|
xml_declaration=True,
|
||||||
|
encoding="UTF-8",
|
||||||
|
)
|
||||||
|
)
|
||||||
|
|
||||||
|
async def handle(
|
||||||
|
self,
|
||||||
|
action: str,
|
||||||
|
request_xml: str,
|
||||||
|
version: Version,
|
||||||
|
client_info: AlpineBitsClientInfo,
|
||||||
|
dbsession: AsyncSession | None = None,
|
||||||
|
server_capabilities=None,
|
||||||
|
) -> AlpineBitsResponse:
|
||||||
|
"""Process FreeRooms inventory updates."""
|
||||||
|
try:
|
||||||
|
self._validate_action_name(action)
|
||||||
|
|
||||||
|
if request_xml is None:
|
||||||
|
raise FreeRoomsProcessingError("Missing request payload")
|
||||||
|
|
||||||
|
if dbsession is None:
|
||||||
|
raise FreeRoomsProcessingError(
|
||||||
|
"Database session unavailable",
|
||||||
|
HttpStatusCode.INTERNAL_SERVER_ERROR,
|
||||||
|
)
|
||||||
|
|
||||||
|
try:
|
||||||
|
request = self._parser.from_string(request_xml, OtaHotelInvCountNotifRq)
|
||||||
|
except Exception as exc: # pragma: no cover - serialization already tested upstream
|
||||||
|
_LOGGER.exception("Failed to parse FreeRooms request: %s", exc)
|
||||||
|
raise FreeRoomsProcessingError("Invalid XML payload") from exc
|
||||||
|
|
||||||
|
hotel_code = request.inventories.hotel_code if request.inventories else None
|
||||||
|
if not hotel_code:
|
||||||
|
raise FreeRoomsProcessingError("HotelCode attribute is required")
|
||||||
|
|
||||||
|
if not client_info or not client_info.username or not client_info.password:
|
||||||
|
raise FreeRoomsProcessingError(
|
||||||
|
"Missing authentication context",
|
||||||
|
HttpStatusCode.UNAUTHORIZED,
|
||||||
|
error_type=ErrorType.VALUE_11,
|
||||||
|
code="401",
|
||||||
|
)
|
||||||
|
|
||||||
|
if not validate_hotel_authentication(
|
||||||
|
client_info.username,
|
||||||
|
client_info.password,
|
||||||
|
hotel_code,
|
||||||
|
self.config,
|
||||||
|
):
|
||||||
|
raise FreeRoomsProcessingError(
|
||||||
|
f"Unauthorized FreeRooms notification for hotel {hotel_code}",
|
||||||
|
HttpStatusCode.UNAUTHORIZED,
|
||||||
|
error_type=ErrorType.VALUE_11,
|
||||||
|
code="401",
|
||||||
|
)
|
||||||
|
|
||||||
|
hotel = await self._fetch_hotel(dbsession, hotel_code)
|
||||||
|
if hotel is None:
|
||||||
|
raise FreeRoomsProcessingError(
|
||||||
|
f"Hotel {hotel_code} is not provisioned on this server"
|
||||||
|
)
|
||||||
|
|
||||||
|
is_complete_set = (
|
||||||
|
request.unique_id is not None
|
||||||
|
and request.unique_id.instance == UniqueIdInstance.COMPLETE_SET
|
||||||
|
)
|
||||||
|
update_type = "CompleteSet" if is_complete_set else "Delta"
|
||||||
|
|
||||||
|
inventory_cache: dict[tuple[str, str | None], HotelInventory] = {}
|
||||||
|
|
||||||
|
try:
|
||||||
|
if is_complete_set:
|
||||||
|
await self._process_complete_set(
|
||||||
|
dbsession, hotel, request, update_type, inventory_cache
|
||||||
|
)
|
||||||
|
else:
|
||||||
|
await self._process_delta(
|
||||||
|
dbsession, hotel, request, update_type, inventory_cache
|
||||||
|
)
|
||||||
|
await dbsession.commit()
|
||||||
|
except FreeRoomsProcessingError:
|
||||||
|
await dbsession.rollback()
|
||||||
|
raise
|
||||||
|
except Exception as exc: # pragma: no cover - defensive
|
||||||
|
await dbsession.rollback()
|
||||||
|
_LOGGER.exception("Unexpected FreeRooms failure: %s", exc)
|
||||||
|
return self._error_response(
|
||||||
|
"Internal server error while processing FreeRooms notification",
|
||||||
|
HttpStatusCode.INTERNAL_SERVER_ERROR,
|
||||||
|
)
|
||||||
|
|
||||||
|
_LOGGER.info(
|
||||||
|
"Processed FreeRooms %s update for hotel %s (%d inventory items)",
|
||||||
|
update_type,
|
||||||
|
hotel_code,
|
||||||
|
len(request.inventories.inventory),
|
||||||
|
)
|
||||||
|
return self._success_response()
|
||||||
|
except FreeRoomsProcessingError as exc:
|
||||||
|
return self._error_response(
|
||||||
|
exc.message,
|
||||||
|
exc.status_code,
|
||||||
|
error_type=exc.error_type,
|
||||||
|
code=exc.code,
|
||||||
|
)
|
||||||
|
|
||||||
|
def _validate_action_name(self, action: str) -> None:
|
||||||
|
expected = self.name.value[1]
|
||||||
|
if (action or "").strip() != expected:
|
||||||
|
raise FreeRoomsProcessingError(
|
||||||
|
f"Invalid action {action}, expected {expected}",
|
||||||
|
HttpStatusCode.BAD_REQUEST,
|
||||||
|
)
|
||||||
|
|
||||||
|
async def _fetch_hotel(self, session: AsyncSession, hotel_code: str) -> Hotel | None:
|
||||||
|
stmt = select(Hotel).where(Hotel.hotel_id == hotel_code, Hotel.is_active.is_(True))
|
||||||
|
result = await session.execute(stmt)
|
||||||
|
return result.scalar_one_or_none()
|
||||||
|
|
||||||
|
async def _process_complete_set(
|
||||||
|
self,
|
||||||
|
session: AsyncSession,
|
||||||
|
hotel: Hotel,
|
||||||
|
request: OtaHotelInvCountNotifRq,
|
||||||
|
update_type: str,
|
||||||
|
inventory_cache: dict[tuple[str, str | None], HotelInventory],
|
||||||
|
) -> None:
|
||||||
|
await self._delete_existing_availability(session, hotel.hotel_id)
|
||||||
|
await self._process_inventories(
|
||||||
|
session, hotel, request, update_type, inventory_cache, enforce_closing_order=True
|
||||||
|
)
|
||||||
|
|
||||||
|
async def _process_delta(
|
||||||
|
self,
|
||||||
|
session: AsyncSession,
|
||||||
|
hotel: Hotel,
|
||||||
|
request: OtaHotelInvCountNotifRq,
|
||||||
|
update_type: str,
|
||||||
|
inventory_cache: dict[tuple[str, str | None], HotelInventory],
|
||||||
|
) -> None:
|
||||||
|
await self._process_inventories(
|
||||||
|
session, hotel, request, update_type, inventory_cache, enforce_closing_order=False
|
||||||
|
)
|
||||||
|
|
||||||
|
async def _delete_existing_availability(
|
||||||
|
self,
|
||||||
|
session: AsyncSession,
|
||||||
|
hotel_id: str,
|
||||||
|
) -> None:
|
||||||
|
subquery = select(HotelInventory.id).where(HotelInventory.hotel_id == hotel_id)
|
||||||
|
await session.execute(
|
||||||
|
delete(RoomAvailability).where(RoomAvailability.inventory_id.in_(subquery))
|
||||||
|
)
|
||||||
|
|
||||||
|
async def _process_inventories(
|
||||||
|
self,
|
||||||
|
session: AsyncSession,
|
||||||
|
hotel: Hotel,
|
||||||
|
request: OtaHotelInvCountNotifRq,
|
||||||
|
update_type: str,
|
||||||
|
inventory_cache: dict[tuple[str, str | None], HotelInventory],
|
||||||
|
enforce_closing_order: bool,
|
||||||
|
) -> None:
|
||||||
|
inventories = request.inventories.inventory if request.inventories else []
|
||||||
|
if not inventories:
|
||||||
|
raise FreeRoomsProcessingError(
|
||||||
|
"Request must include at least one Inventory block",
|
||||||
|
HttpStatusCode.BAD_REQUEST,
|
||||||
|
)
|
||||||
|
|
||||||
|
rows_to_upsert: list[dict[str, Any]] = []
|
||||||
|
now = datetime.now(UTC)
|
||||||
|
encountered_standard = False
|
||||||
|
|
||||||
|
for inventory in inventories:
|
||||||
|
sac = inventory.status_application_control
|
||||||
|
if sac is None:
|
||||||
|
raise FreeRoomsProcessingError(
|
||||||
|
"StatusApplicationControl element is required for each Inventory",
|
||||||
|
HttpStatusCode.BAD_REQUEST,
|
||||||
|
)
|
||||||
|
|
||||||
|
is_closing = self._is_closing_season(sac)
|
||||||
|
if is_closing:
|
||||||
|
if inventory.inv_counts is not None:
|
||||||
|
raise FreeRoomsProcessingError(
|
||||||
|
"Closing seasons cannot contain InvCounts data",
|
||||||
|
HttpStatusCode.BAD_REQUEST,
|
||||||
|
)
|
||||||
|
if update_type != "CompleteSet":
|
||||||
|
raise FreeRoomsProcessingError(
|
||||||
|
"Closing seasons are only allowed on CompleteSet updates",
|
||||||
|
HttpStatusCode.BAD_REQUEST,
|
||||||
|
)
|
||||||
|
if enforce_closing_order and encountered_standard:
|
||||||
|
raise FreeRoomsProcessingError(
|
||||||
|
"Closing seasons must appear before other inventory entries",
|
||||||
|
HttpStatusCode.BAD_REQUEST,
|
||||||
|
)
|
||||||
|
rows_to_upsert.extend(
|
||||||
|
await self._process_closing_season(
|
||||||
|
session, hotel, sac, update_type, now, inventory_cache
|
||||||
|
)
|
||||||
|
)
|
||||||
|
continue
|
||||||
|
|
||||||
|
encountered_standard = True
|
||||||
|
rows_to_upsert.extend(
|
||||||
|
await self._process_inventory_item(
|
||||||
|
session,
|
||||||
|
hotel,
|
||||||
|
sac,
|
||||||
|
inventory.inv_counts,
|
||||||
|
update_type,
|
||||||
|
now,
|
||||||
|
inventory_cache,
|
||||||
|
)
|
||||||
|
)
|
||||||
|
|
||||||
|
await self._upsert_availability_rows(session, rows_to_upsert)
|
||||||
|
|
||||||
|
async def _process_closing_season(
|
||||||
|
self,
|
||||||
|
session: AsyncSession,
|
||||||
|
hotel: Hotel,
|
||||||
|
sac: OtaHotelInvCountNotifRq.Inventories.Inventory.StatusApplicationControl,
|
||||||
|
update_type: str,
|
||||||
|
timestamp: datetime,
|
||||||
|
inventory_cache: dict[tuple[str, str | None], HotelInventory],
|
||||||
|
) -> list[dict[str, Any]]:
|
||||||
|
if sac.inv_type_code or sac.inv_code:
|
||||||
|
raise FreeRoomsProcessingError(
|
||||||
|
"Closing season entries cannot specify InvTypeCode or InvCode",
|
||||||
|
HttpStatusCode.BAD_REQUEST,
|
||||||
|
)
|
||||||
|
|
||||||
|
start_date, end_date = self._parse_date_range(sac.start, sac.end)
|
||||||
|
inventory_item = await self._ensure_inventory_item(
|
||||||
|
session,
|
||||||
|
hotel.hotel_id,
|
||||||
|
CLOSING_SEASON_TYPE,
|
||||||
|
None,
|
||||||
|
timestamp,
|
||||||
|
inventory_cache,
|
||||||
|
)
|
||||||
|
|
||||||
|
base_payload = {
|
||||||
|
"inventory_id": inventory_item.id,
|
||||||
|
"count_type_2": None,
|
||||||
|
"count_type_6": None,
|
||||||
|
"count_type_9": None,
|
||||||
|
"is_closing_season": True,
|
||||||
|
"last_updated": timestamp,
|
||||||
|
"update_type": update_type,
|
||||||
|
}
|
||||||
|
|
||||||
|
rows = []
|
||||||
|
for day in self._iter_days(start_date, end_date):
|
||||||
|
payload = dict(base_payload)
|
||||||
|
payload["date"] = day
|
||||||
|
rows.append(payload)
|
||||||
|
return rows
|
||||||
|
|
||||||
|
async def _process_inventory_item(
|
||||||
|
self,
|
||||||
|
session: AsyncSession,
|
||||||
|
hotel: Hotel,
|
||||||
|
sac: OtaHotelInvCountNotifRq.Inventories.Inventory.StatusApplicationControl,
|
||||||
|
inv_counts: (
|
||||||
|
OtaHotelInvCountNotifRq.Inventories.Inventory.InvCounts | None
|
||||||
|
),
|
||||||
|
update_type: str,
|
||||||
|
timestamp: datetime,
|
||||||
|
inventory_cache: dict[tuple[str, str | None], HotelInventory],
|
||||||
|
) -> list[dict[str, Any]]:
|
||||||
|
inv_type_code = (sac.inv_type_code or "").strip()
|
||||||
|
if not inv_type_code:
|
||||||
|
raise FreeRoomsProcessingError(
|
||||||
|
"InvTypeCode is required unless AllInvCode=\"true\"",
|
||||||
|
HttpStatusCode.BAD_REQUEST,
|
||||||
|
)
|
||||||
|
|
||||||
|
inv_code = sac.inv_code.strip() if sac.inv_code else None
|
||||||
|
start_date, end_date = self._parse_date_range(sac.start, sac.end)
|
||||||
|
|
||||||
|
counts = self._extract_counts(inv_counts)
|
||||||
|
base_counts = {
|
||||||
|
"count_type_2": counts.get("count_type_2"),
|
||||||
|
"count_type_6": counts.get("count_type_6"),
|
||||||
|
"count_type_9": counts.get("count_type_9"),
|
||||||
|
}
|
||||||
|
|
||||||
|
inventory_item = await self._ensure_inventory_item(
|
||||||
|
session,
|
||||||
|
hotel.hotel_id,
|
||||||
|
inv_type_code,
|
||||||
|
inv_code,
|
||||||
|
timestamp,
|
||||||
|
inventory_cache,
|
||||||
|
)
|
||||||
|
|
||||||
|
base_payload = {
|
||||||
|
"inventory_id": inventory_item.id,
|
||||||
|
"is_closing_season": False,
|
||||||
|
"last_updated": timestamp,
|
||||||
|
"update_type": update_type,
|
||||||
|
**base_counts,
|
||||||
|
}
|
||||||
|
|
||||||
|
rows = []
|
||||||
|
for day in self._iter_days(start_date, end_date):
|
||||||
|
payload = dict(base_payload)
|
||||||
|
payload["date"] = day
|
||||||
|
rows.append(payload)
|
||||||
|
return rows
|
||||||
|
|
||||||
|
def _parse_date_range(self, start_str: str, end_str: str) -> tuple[date, date]:
|
||||||
|
try:
|
||||||
|
start_date = date.fromisoformat(start_str)
|
||||||
|
end_date = date.fromisoformat(end_str)
|
||||||
|
except ValueError as exc:
|
||||||
|
raise FreeRoomsProcessingError(
|
||||||
|
f"Invalid date format: {exc!s}",
|
||||||
|
HttpStatusCode.BAD_REQUEST,
|
||||||
|
) from exc
|
||||||
|
|
||||||
|
if end_date < start_date:
|
||||||
|
raise FreeRoomsProcessingError(
|
||||||
|
"StatusApplicationControl End date cannot be before Start date",
|
||||||
|
HttpStatusCode.BAD_REQUEST,
|
||||||
|
)
|
||||||
|
return start_date, end_date
|
||||||
|
|
||||||
|
def _iter_days(self, start_date: date, end_date: date):
|
||||||
|
current = start_date
|
||||||
|
while current <= end_date:
|
||||||
|
yield current
|
||||||
|
current += timedelta(days=1)
|
||||||
|
|
||||||
|
def _is_closing_season(
|
||||||
|
self,
|
||||||
|
sac: OtaHotelInvCountNotifRq.Inventories.Inventory.StatusApplicationControl,
|
||||||
|
) -> bool:
|
||||||
|
return (sac.all_inv_code or "").strip().lower() == "true"
|
||||||
|
|
||||||
|
def _extract_counts(
|
||||||
|
self,
|
||||||
|
inv_counts: OtaHotelInvCountNotifRq.Inventories.Inventory.InvCounts | None,
|
||||||
|
) -> dict[str, int | None]:
|
||||||
|
if inv_counts is None or not inv_counts.inv_count:
|
||||||
|
return {}
|
||||||
|
|
||||||
|
parsed: dict[str, int] = {}
|
||||||
|
for count in inv_counts.inv_count:
|
||||||
|
column_name = COUNT_TYPE_MAP.get(count.count_type)
|
||||||
|
if column_name is None:
|
||||||
|
raise FreeRoomsProcessingError(
|
||||||
|
f"Unsupported CountType {count.count_type}",
|
||||||
|
HttpStatusCode.BAD_REQUEST,
|
||||||
|
)
|
||||||
|
|
||||||
|
if column_name in parsed:
|
||||||
|
raise FreeRoomsProcessingError(
|
||||||
|
f"Duplicate CountType {count.count_type.value} detected",
|
||||||
|
HttpStatusCode.BAD_REQUEST,
|
||||||
|
)
|
||||||
|
|
||||||
|
try:
|
||||||
|
value = int(count.count)
|
||||||
|
except ValueError as exc:
|
||||||
|
raise FreeRoomsProcessingError(
|
||||||
|
f"Invalid Count value '{count.count}'",
|
||||||
|
HttpStatusCode.BAD_REQUEST,
|
||||||
|
) from exc
|
||||||
|
|
||||||
|
if value < 0:
|
||||||
|
raise FreeRoomsProcessingError(
|
||||||
|
"Count values must be non-negative",
|
||||||
|
HttpStatusCode.BAD_REQUEST,
|
||||||
|
)
|
||||||
|
|
||||||
|
parsed[column_name] = value
|
||||||
|
|
||||||
|
return parsed
|
||||||
|
|
||||||
|
async def _ensure_inventory_item(
|
||||||
|
self,
|
||||||
|
session: AsyncSession,
|
||||||
|
hotel_id: str,
|
||||||
|
inv_type_code: str,
|
||||||
|
inv_code: str | None,
|
||||||
|
timestamp: datetime,
|
||||||
|
cache: dict[tuple[str, str | None], HotelInventory],
|
||||||
|
) -> HotelInventory:
|
||||||
|
cache_key = (inv_type_code, inv_code)
|
||||||
|
if cache_key in cache:
|
||||||
|
return cache[cache_key]
|
||||||
|
|
||||||
|
filters = [
|
||||||
|
HotelInventory.hotel_id == hotel_id,
|
||||||
|
HotelInventory.inv_type_code == inv_type_code,
|
||||||
|
]
|
||||||
|
if inv_code is None:
|
||||||
|
filters.append(HotelInventory.inv_code.is_(None))
|
||||||
|
else:
|
||||||
|
filters.append(HotelInventory.inv_code == inv_code)
|
||||||
|
|
||||||
|
stmt = select(HotelInventory).where(*filters)
|
||||||
|
result = await session.execute(stmt)
|
||||||
|
inventory_item = result.scalar_one_or_none()
|
||||||
|
|
||||||
|
if inventory_item:
|
||||||
|
inventory_item.last_updated = timestamp
|
||||||
|
else:
|
||||||
|
inventory_item = HotelInventory(
|
||||||
|
hotel_id=hotel_id,
|
||||||
|
inv_type_code=inv_type_code,
|
||||||
|
inv_code=inv_code,
|
||||||
|
source=SOURCE_FREEROOMS,
|
||||||
|
first_seen=timestamp,
|
||||||
|
last_updated=timestamp,
|
||||||
|
)
|
||||||
|
session.add(inventory_item)
|
||||||
|
await session.flush()
|
||||||
|
|
||||||
|
cache[cache_key] = inventory_item
|
||||||
|
return inventory_item
|
||||||
|
|
||||||
|
async def _upsert_availability_rows(
|
||||||
|
self,
|
||||||
|
session: AsyncSession,
|
||||||
|
rows: list[dict[str, Any]],
|
||||||
|
) -> None:
|
||||||
|
if not rows:
|
||||||
|
return
|
||||||
|
|
||||||
|
bind = session.get_bind()
|
||||||
|
dialect_name = bind.dialect.name if bind else ""
|
||||||
|
table = RoomAvailability.__table__
|
||||||
|
|
||||||
|
if dialect_name == "postgresql":
|
||||||
|
stmt = pg_insert(table).values(rows)
|
||||||
|
stmt = stmt.on_conflict_do_update(
|
||||||
|
index_elements=["inventory_id", "date"],
|
||||||
|
set_=self._build_upsert_set(stmt),
|
||||||
|
)
|
||||||
|
await session.execute(stmt)
|
||||||
|
return
|
||||||
|
|
||||||
|
if dialect_name == "sqlite":
|
||||||
|
stmt = sqlite_insert(table).values(rows)
|
||||||
|
stmt = stmt.on_conflict_do_update(
|
||||||
|
index_elements=["inventory_id", "date"],
|
||||||
|
set_=self._build_upsert_set(stmt),
|
||||||
|
)
|
||||||
|
await session.execute(stmt)
|
||||||
|
return
|
||||||
|
|
||||||
|
await self._upsert_with_fallback(session, rows)
|
||||||
|
|
||||||
|
def _build_upsert_set(self, stmt):
|
||||||
|
return {
|
||||||
|
"count_type_2": stmt.excluded.count_type_2,
|
||||||
|
"count_type_6": stmt.excluded.count_type_6,
|
||||||
|
"count_type_9": stmt.excluded.count_type_9,
|
||||||
|
"is_closing_season": stmt.excluded.is_closing_season,
|
||||||
|
"last_updated": stmt.excluded.last_updated,
|
||||||
|
"update_type": stmt.excluded.update_type,
|
||||||
|
}
|
||||||
|
|
||||||
|
async def _upsert_with_fallback(
|
||||||
|
self, session: AsyncSession, rows: list[dict[str, Any]]
|
||||||
|
) -> None:
|
||||||
|
for row in rows:
|
||||||
|
stmt = select(RoomAvailability).where(
|
||||||
|
RoomAvailability.inventory_id == row["inventory_id"],
|
||||||
|
RoomAvailability.date == row["date"],
|
||||||
|
)
|
||||||
|
result = await session.execute(stmt)
|
||||||
|
existing = result.scalar_one_or_none()
|
||||||
|
|
||||||
|
if existing:
|
||||||
|
existing.count_type_2 = row["count_type_2"]
|
||||||
|
existing.count_type_6 = row["count_type_6"]
|
||||||
|
existing.count_type_9 = row["count_type_9"]
|
||||||
|
existing.is_closing_season = row["is_closing_season"]
|
||||||
|
existing.last_updated = row["last_updated"]
|
||||||
|
existing.update_type = row["update_type"]
|
||||||
|
else:
|
||||||
|
session.add(RoomAvailability(**row))
|
||||||
|
|
||||||
|
def _success_response(self) -> AlpineBitsResponse:
|
||||||
|
response = OtaHotelInvCountNotifRs(version="7.000", success="")
|
||||||
|
xml = self._serializer.render(
|
||||||
|
response, ns_map={None: "http://www.opentravel.org/OTA/2003/05"}
|
||||||
|
)
|
||||||
|
return AlpineBitsResponse(xml, HttpStatusCode.OK)
|
||||||
|
|
||||||
|
def _error_response(
|
||||||
|
self,
|
||||||
|
message: str,
|
||||||
|
status_code: HttpStatusCode,
|
||||||
|
error_type: ErrorType = ErrorType.VALUE_13,
|
||||||
|
code: str = "450",
|
||||||
|
) -> AlpineBitsResponse:
|
||||||
|
error = OtaHotelInvCountNotifRs.Errors.Error(
|
||||||
|
type_value=error_type,
|
||||||
|
code=code,
|
||||||
|
content=[message],
|
||||||
|
)
|
||||||
|
errors = OtaHotelInvCountNotifRs.Errors(error=[error])
|
||||||
|
response = OtaHotelInvCountNotifRs(version="7.000", errors=errors)
|
||||||
|
xml = self._serializer.render(
|
||||||
|
response, ns_map={None: "http://www.opentravel.org/OTA/2003/05"}
|
||||||
|
)
|
||||||
|
return AlpineBitsResponse(xml, status_code)
|
||||||
246
src/alpine_bits_python/hotel_service.py
Normal file
246
src/alpine_bits_python/hotel_service.py
Normal file
@@ -0,0 +1,246 @@
|
|||||||
|
"""Hotel service for managing hotel configuration."""
|
||||||
|
|
||||||
|
import secrets
|
||||||
|
from datetime import UTC, datetime
|
||||||
|
from typing import Any
|
||||||
|
|
||||||
|
import bcrypt
|
||||||
|
from sqlalchemy import and_, select
|
||||||
|
from sqlalchemy.ext.asyncio import AsyncSession
|
||||||
|
from sqlalchemy.orm import joinedload
|
||||||
|
|
||||||
|
from .db import Hotel, WebhookEndpoint
|
||||||
|
from .logging_config import get_logger
|
||||||
|
|
||||||
|
_LOGGER = get_logger(__name__)
|
||||||
|
|
||||||
|
|
||||||
|
def hash_password(password: str) -> str:
|
||||||
|
"""Hash password using bcrypt.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
password: Plain text password
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
Bcrypt hashed password
|
||||||
|
"""
|
||||||
|
salt = bcrypt.gensalt(rounds=12)
|
||||||
|
return bcrypt.hashpw(password.encode('utf-8'), salt).decode('utf-8')
|
||||||
|
|
||||||
|
|
||||||
|
def verify_password(password: str, password_hash: str) -> bool:
|
||||||
|
"""Verify password against bcrypt hash.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
password: Plain text password
|
||||||
|
password_hash: Bcrypt hash to verify against
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
True if password matches, False otherwise
|
||||||
|
"""
|
||||||
|
return bcrypt.checkpw(
|
||||||
|
password.encode('utf-8'),
|
||||||
|
password_hash.encode('utf-8')
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
def generate_webhook_secret() -> str:
|
||||||
|
"""Generate cryptographically secure webhook secret.
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
64-character URL-safe random string
|
||||||
|
"""
|
||||||
|
return secrets.token_urlsafe(48) # 48 bytes = 64 URL-safe chars
|
||||||
|
|
||||||
|
|
||||||
|
async def sync_config_to_database(
|
||||||
|
db_session: AsyncSession,
|
||||||
|
config: dict[str, Any]
|
||||||
|
) -> dict[str, int]:
|
||||||
|
"""Sync alpine_bits_auth from config.yaml to database.
|
||||||
|
|
||||||
|
Creates/updates hotels and generates webhook_endpoints if missing.
|
||||||
|
Idempotent - safe to run on every startup.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
db_session: Database session
|
||||||
|
config: Application configuration dict
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
Statistics dict with counts of created/updated records
|
||||||
|
"""
|
||||||
|
stats = {"hotels_created": 0, "hotels_updated": 0, "endpoints_created": 0}
|
||||||
|
|
||||||
|
alpine_bits_auth = config.get("alpine_bits_auth", [])
|
||||||
|
if not alpine_bits_auth:
|
||||||
|
_LOGGER.info("No hotels found in alpine_bits_auth config")
|
||||||
|
return stats
|
||||||
|
|
||||||
|
for hotel_config in alpine_bits_auth:
|
||||||
|
hotel_id = hotel_config.get("hotel_id")
|
||||||
|
if not hotel_id:
|
||||||
|
_LOGGER.warning("Skipping hotel config without hotel_id: %s", hotel_config)
|
||||||
|
continue
|
||||||
|
|
||||||
|
# Check if hotel exists
|
||||||
|
result = await db_session.execute(
|
||||||
|
select(Hotel).where(Hotel.hotel_id == hotel_id)
|
||||||
|
)
|
||||||
|
hotel = result.scalar_one_or_none()
|
||||||
|
|
||||||
|
if not hotel:
|
||||||
|
# Create new hotel
|
||||||
|
password_hash = hash_password(hotel_config["password"])
|
||||||
|
|
||||||
|
hotel = Hotel(
|
||||||
|
hotel_id=hotel_id,
|
||||||
|
hotel_name=hotel_config.get("hotel_name", hotel_id),
|
||||||
|
username=hotel_config["username"],
|
||||||
|
password_hash=password_hash,
|
||||||
|
meta_account_id=hotel_config.get("meta_account"),
|
||||||
|
google_account_id=hotel_config.get("google_account"),
|
||||||
|
push_endpoint_url=hotel_config.get("push_endpoint", {}).get("url"),
|
||||||
|
push_endpoint_token=hotel_config.get("push_endpoint", {}).get("token"),
|
||||||
|
push_endpoint_username=hotel_config.get("push_endpoint", {}).get("username"),
|
||||||
|
created_at=datetime.now(UTC),
|
||||||
|
updated_at=datetime.now(UTC),
|
||||||
|
is_active=True,
|
||||||
|
)
|
||||||
|
db_session.add(hotel)
|
||||||
|
await db_session.flush()
|
||||||
|
stats["hotels_created"] += 1
|
||||||
|
_LOGGER.info("Created hotel: %s", hotel_id)
|
||||||
|
else:
|
||||||
|
# Update existing hotel (config may have changed)
|
||||||
|
# Note: We do NOT update password_hash for security reasons
|
||||||
|
hotel.hotel_name = hotel_config.get("hotel_name", hotel_id)
|
||||||
|
hotel.meta_account_id = hotel_config.get("meta_account")
|
||||||
|
hotel.google_account_id = hotel_config.get("google_account")
|
||||||
|
push_endpoint = hotel_config.get("push_endpoint", {})
|
||||||
|
hotel.push_endpoint_url = push_endpoint.get("url")
|
||||||
|
hotel.push_endpoint_token = push_endpoint.get("token")
|
||||||
|
hotel.push_endpoint_username = push_endpoint.get("username")
|
||||||
|
hotel.updated_at = datetime.now(UTC)
|
||||||
|
stats["hotels_updated"] += 1
|
||||||
|
_LOGGER.debug("Updated hotel: %s", hotel_id)
|
||||||
|
|
||||||
|
# Ensure hotel has at least default webhook endpoints
|
||||||
|
result = await db_session.execute(
|
||||||
|
select(WebhookEndpoint).where(WebhookEndpoint.hotel_id == hotel_id)
|
||||||
|
)
|
||||||
|
existing_endpoints = result.scalars().all()
|
||||||
|
|
||||||
|
if not existing_endpoints:
|
||||||
|
# Create default webhook endpoints for backward compatibility
|
||||||
|
for webhook_type in ["wix_form", "generic"]:
|
||||||
|
webhook_secret = generate_webhook_secret()
|
||||||
|
endpoint = WebhookEndpoint(
|
||||||
|
hotel_id=hotel_id,
|
||||||
|
webhook_secret=webhook_secret,
|
||||||
|
webhook_type=webhook_type,
|
||||||
|
description=f"Auto-generated {webhook_type} endpoint",
|
||||||
|
is_enabled=True,
|
||||||
|
created_at=datetime.now(UTC),
|
||||||
|
)
|
||||||
|
db_session.add(endpoint)
|
||||||
|
stats["endpoints_created"] += 1
|
||||||
|
_LOGGER.info(
|
||||||
|
"Created webhook endpoint for hotel %s, type=%s, secret=%s",
|
||||||
|
hotel_id,
|
||||||
|
webhook_type,
|
||||||
|
webhook_secret
|
||||||
|
)
|
||||||
|
|
||||||
|
await db_session.commit()
|
||||||
|
|
||||||
|
_LOGGER.info(
|
||||||
|
"Config sync complete: %d hotels created, %d updated, %d endpoints created",
|
||||||
|
stats["hotels_created"],
|
||||||
|
stats["hotels_updated"],
|
||||||
|
stats["endpoints_created"]
|
||||||
|
)
|
||||||
|
|
||||||
|
return stats
|
||||||
|
|
||||||
|
|
||||||
|
class HotelService:
|
||||||
|
"""Service for hotel configuration access.
|
||||||
|
|
||||||
|
Always reads from database (synced from config at startup).
|
||||||
|
"""
|
||||||
|
|
||||||
|
def __init__(self, db_session: AsyncSession):
|
||||||
|
"""Initialize HotelService.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
db_session: Database session
|
||||||
|
"""
|
||||||
|
self.db_session = db_session
|
||||||
|
|
||||||
|
async def get_hotel_by_id(self, hotel_id: str) -> Hotel | None:
|
||||||
|
"""Get hotel by hotel_id.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
hotel_id: Hotel identifier
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
Hotel instance or None if not found
|
||||||
|
"""
|
||||||
|
result = await self.db_session.execute(
|
||||||
|
select(Hotel)
|
||||||
|
.where(
|
||||||
|
and_(
|
||||||
|
Hotel.hotel_id == hotel_id,
|
||||||
|
Hotel.is_active == True
|
||||||
|
)
|
||||||
|
)
|
||||||
|
)
|
||||||
|
return result.scalar_one_or_none()
|
||||||
|
|
||||||
|
async def get_hotel_by_webhook_secret(
|
||||||
|
self,
|
||||||
|
webhook_secret: str
|
||||||
|
) -> tuple[Hotel, WebhookEndpoint] | tuple[None, None]:
|
||||||
|
"""Get hotel and webhook_endpoint by webhook_secret.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
webhook_secret: Webhook secret string
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
Tuple of (Hotel, WebhookEndpoint) or (None, None) if not found
|
||||||
|
"""
|
||||||
|
result = await self.db_session.execute(
|
||||||
|
select(WebhookEndpoint)
|
||||||
|
.where(
|
||||||
|
and_(
|
||||||
|
WebhookEndpoint.webhook_secret == webhook_secret,
|
||||||
|
WebhookEndpoint.is_enabled == True
|
||||||
|
)
|
||||||
|
)
|
||||||
|
.options(joinedload(WebhookEndpoint.hotel))
|
||||||
|
)
|
||||||
|
endpoint = result.scalar_one_or_none()
|
||||||
|
|
||||||
|
if endpoint and endpoint.hotel.is_active:
|
||||||
|
return endpoint.hotel, endpoint
|
||||||
|
return None, None
|
||||||
|
|
||||||
|
async def get_hotel_by_username(self, username: str) -> Hotel | None:
|
||||||
|
"""Get hotel by AlpineBits username.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
username: AlpineBits username
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
Hotel instance or None if not found
|
||||||
|
"""
|
||||||
|
result = await self.db_session.execute(
|
||||||
|
select(Hotel)
|
||||||
|
.where(
|
||||||
|
and_(
|
||||||
|
Hotel.username == username,
|
||||||
|
Hotel.is_active == True
|
||||||
|
)
|
||||||
|
)
|
||||||
|
)
|
||||||
|
return result.scalar_one_or_none()
|
||||||
@@ -10,11 +10,15 @@ from XML generation (xsdata) follows clean architecture principles.
|
|||||||
"""
|
"""
|
||||||
|
|
||||||
import hashlib
|
import hashlib
|
||||||
|
import json
|
||||||
from datetime import date, datetime
|
from datetime import date, datetime
|
||||||
from enum import Enum
|
from enum import Enum
|
||||||
|
from typing import Any
|
||||||
|
|
||||||
from pydantic import BaseModel, EmailStr, Field, field_validator, model_validator
|
from pydantic import BaseModel, EmailStr, Field, field_validator, model_validator
|
||||||
|
|
||||||
|
from .const import WebhookStatus
|
||||||
|
|
||||||
|
|
||||||
# Country name to ISO 3166-1 alpha-2 code mapping
|
# Country name to ISO 3166-1 alpha-2 code mapping
|
||||||
COUNTRY_NAME_TO_CODE = {
|
COUNTRY_NAME_TO_CODE = {
|
||||||
@@ -308,6 +312,148 @@ class CommentsData(BaseModel):
|
|||||||
model_config = {"from_attributes": True}
|
model_config = {"from_attributes": True}
|
||||||
|
|
||||||
|
|
||||||
|
class HotelData(BaseModel):
|
||||||
|
"""Validated hotel configuration data."""
|
||||||
|
|
||||||
|
hotel_id: str = Field(..., min_length=1, max_length=50)
|
||||||
|
hotel_name: str = Field(..., min_length=1, max_length=200)
|
||||||
|
username: str = Field(..., min_length=1, max_length=100)
|
||||||
|
password_hash: str = Field(..., min_length=1, max_length=200)
|
||||||
|
meta_account_id: str | None = Field(None, max_length=50)
|
||||||
|
google_account_id: str | None = Field(None, max_length=50)
|
||||||
|
push_endpoint_url: str | None = Field(None, max_length=500)
|
||||||
|
push_endpoint_token: str | None = Field(None, max_length=200)
|
||||||
|
push_endpoint_username: str | None = Field(None, max_length=100)
|
||||||
|
created_at: datetime = Field(default_factory=lambda: datetime.now())
|
||||||
|
updated_at: datetime = Field(default_factory=lambda: datetime.now())
|
||||||
|
is_active: bool = Field(default=True)
|
||||||
|
|
||||||
|
@field_validator("hotel_id", "hotel_name", "username")
|
||||||
|
@classmethod
|
||||||
|
def strip_whitespace(cls, v: str) -> str:
|
||||||
|
"""Remove leading/trailing whitespace."""
|
||||||
|
return v.strip()
|
||||||
|
|
||||||
|
model_config = {"from_attributes": True}
|
||||||
|
|
||||||
|
|
||||||
|
class WebhookEndpointData(BaseModel):
|
||||||
|
"""Validated webhook endpoint configuration data."""
|
||||||
|
|
||||||
|
hotel_id: str = Field(..., min_length=1, max_length=50)
|
||||||
|
webhook_secret: str = Field(..., min_length=1, max_length=64)
|
||||||
|
webhook_type: str = Field(..., min_length=1, max_length=50)
|
||||||
|
description: str | None = Field(None, max_length=200)
|
||||||
|
is_enabled: bool = Field(default=True)
|
||||||
|
created_at: datetime = Field(default_factory=lambda: datetime.now())
|
||||||
|
|
||||||
|
@field_validator("hotel_id", "webhook_secret", "webhook_type")
|
||||||
|
@classmethod
|
||||||
|
def strip_whitespace(cls, v: str) -> str:
|
||||||
|
"""Remove leading/trailing whitespace."""
|
||||||
|
return v.strip()
|
||||||
|
|
||||||
|
model_config = {"from_attributes": True}
|
||||||
|
|
||||||
|
|
||||||
|
class WebhookRequestData(BaseModel):
|
||||||
|
"""Validated webhook request data.
|
||||||
|
|
||||||
|
This model handles the special case where:
|
||||||
|
- payload_json is required for creation (to calculate payload_hash)
|
||||||
|
- payload_json becomes optional after processing (can be purged for privacy/storage)
|
||||||
|
- payload_hash is auto-calculated from payload_json when provided
|
||||||
|
"""
|
||||||
|
|
||||||
|
# Required fields
|
||||||
|
payload_json: dict[str, Any] | None = Field(
|
||||||
|
...,
|
||||||
|
description="Webhook payload (required for creation, nullable after purge)"
|
||||||
|
)
|
||||||
|
|
||||||
|
# Auto-calculated from payload_json
|
||||||
|
payload_hash: str | None = Field(
|
||||||
|
None,
|
||||||
|
min_length=64,
|
||||||
|
max_length=64,
|
||||||
|
description="SHA256 hash of canonical JSON payload (auto-calculated)"
|
||||||
|
)
|
||||||
|
|
||||||
|
# Optional foreign keys
|
||||||
|
webhook_endpoint_id: int | None = Field(None, gt=0)
|
||||||
|
hotel_id: str | None = Field(None, max_length=50)
|
||||||
|
|
||||||
|
# Processing tracking
|
||||||
|
status: WebhookStatus = Field(default=WebhookStatus.PENDING)
|
||||||
|
processing_started_at: datetime | None = None
|
||||||
|
processing_completed_at: datetime | None = None
|
||||||
|
|
||||||
|
# Retry handling
|
||||||
|
retry_count: int = Field(default=0, ge=0)
|
||||||
|
last_error: str | None = Field(None, max_length=2000)
|
||||||
|
|
||||||
|
# Payload metadata
|
||||||
|
purged_at: datetime | None = None
|
||||||
|
|
||||||
|
# Request metadata
|
||||||
|
created_at: datetime = Field(default_factory=lambda: datetime.now())
|
||||||
|
source_ip: str | None = Field(None, max_length=45)
|
||||||
|
user_agent: str | None = Field(None, max_length=500)
|
||||||
|
|
||||||
|
# Result tracking
|
||||||
|
created_customer_id: int | None = Field(None, gt=0)
|
||||||
|
created_reservation_id: int | None = Field(None, gt=0)
|
||||||
|
|
||||||
|
@model_validator(mode="after")
|
||||||
|
def calculate_payload_hash(self) -> "WebhookRequestData":
|
||||||
|
"""Auto-calculate payload_hash from payload_json if not provided.
|
||||||
|
|
||||||
|
Uses the same hashing algorithm as api.py:
|
||||||
|
- Canonical JSON with sorted keys
|
||||||
|
- UTF-8 encoding
|
||||||
|
- SHA256 hash
|
||||||
|
|
||||||
|
This runs after all field validation, so we can access the validated payload_json.
|
||||||
|
"""
|
||||||
|
# Only calculate if payload_json is provided and payload_hash is not set
|
||||||
|
if self.payload_json is not None and self.payload_hash is None:
|
||||||
|
# Create canonical JSON string (sorted keys for consistency)
|
||||||
|
payload_json_str = json.dumps(self.payload_json, sort_keys=True)
|
||||||
|
# Calculate SHA256 hash
|
||||||
|
self.payload_hash = hashlib.sha256(
|
||||||
|
payload_json_str.encode("utf-8")
|
||||||
|
).hexdigest()
|
||||||
|
|
||||||
|
return self
|
||||||
|
|
||||||
|
@model_validator(mode="after")
|
||||||
|
def validate_payload_hash_requirements(self) -> "WebhookRequestData":
|
||||||
|
"""Ensure payload_hash is present (either provided or calculated).
|
||||||
|
|
||||||
|
This validator runs after calculate_payload_hash, so payload_hash should
|
||||||
|
be set if payload_json was provided.
|
||||||
|
"""
|
||||||
|
if self.payload_hash is None:
|
||||||
|
raise ValueError(
|
||||||
|
"payload_hash is required. It can be auto-calculated from payload_json "
|
||||||
|
"or explicitly provided."
|
||||||
|
)
|
||||||
|
|
||||||
|
return self
|
||||||
|
|
||||||
|
@field_validator("status", mode="before")
|
||||||
|
@classmethod
|
||||||
|
def normalize_status(cls, v: str | WebhookStatus) -> WebhookStatus:
|
||||||
|
"""Normalize status to WebhookStatus enum."""
|
||||||
|
if isinstance(v, WebhookStatus):
|
||||||
|
return v
|
||||||
|
if isinstance(v, str):
|
||||||
|
return WebhookStatus(v)
|
||||||
|
raise ValueError(f"Invalid webhook status: {v}")
|
||||||
|
|
||||||
|
model_config = {"from_attributes": True}
|
||||||
|
|
||||||
|
|
||||||
# Example usage in a service layer
|
# Example usage in a service layer
|
||||||
class ReservationService:
|
class ReservationService:
|
||||||
"""Example service showing how to use Pydantic models with SQLAlchemy."""
|
"""Example service showing how to use Pydantic models with SQLAlchemy."""
|
||||||
|
|||||||
717
src/alpine_bits_python/webhook_processor.py
Normal file
717
src/alpine_bits_python/webhook_processor.py
Normal file
@@ -0,0 +1,717 @@
|
|||||||
|
"""Webhook processor interface and implementations."""
|
||||||
|
|
||||||
|
import asyncio
|
||||||
|
from datetime import date, datetime
|
||||||
|
from typing import Any, Protocol
|
||||||
|
|
||||||
|
from fastapi import HTTPException
|
||||||
|
from sqlalchemy.exc import IntegrityError
|
||||||
|
from sqlalchemy.ext.asyncio import AsyncSession
|
||||||
|
|
||||||
|
from alpine_bits_python.auth import generate_unique_id
|
||||||
|
from alpine_bits_python.config_loader import get_advertising_account_ids
|
||||||
|
from alpine_bits_python.customer_service import CustomerService
|
||||||
|
from alpine_bits_python.reservation_service import ReservationService
|
||||||
|
from alpine_bits_python.schemas import ReservationData
|
||||||
|
|
||||||
|
from .db import WebhookRequest
|
||||||
|
from .logging_config import get_logger
|
||||||
|
|
||||||
|
_LOGGER = get_logger(__name__)
|
||||||
|
|
||||||
|
|
||||||
|
class WebhookProcessorProtocol(Protocol):
|
||||||
|
"""Protocol for webhook processors."""
|
||||||
|
|
||||||
|
@property
|
||||||
|
def webhook_type(self) -> str:
|
||||||
|
"""Return webhook type identifier (e.g., 'wix_form', 'generic')."""
|
||||||
|
...
|
||||||
|
|
||||||
|
async def process(
|
||||||
|
self,
|
||||||
|
webhook_request: WebhookRequest,
|
||||||
|
db_session: AsyncSession,
|
||||||
|
config: dict[str, Any] | None = None,
|
||||||
|
event_dispatcher: Any | None = None,
|
||||||
|
) -> dict[str, Any]:
|
||||||
|
"""Process webhook payload.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
webhook_request: WebhookRequest database record (contains payload_json and hotel_id)
|
||||||
|
db_session: Database session
|
||||||
|
config: Application configuration (optional)
|
||||||
|
event_dispatcher: Event dispatcher for push notifications (optional)
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
Response dict with status, message, customer_id, reservation_id
|
||||||
|
|
||||||
|
Raises:
|
||||||
|
HTTPException on processing errors
|
||||||
|
|
||||||
|
"""
|
||||||
|
...
|
||||||
|
|
||||||
|
|
||||||
|
class WebhookProcessorRegistry:
|
||||||
|
"""Registry for webhook processors."""
|
||||||
|
|
||||||
|
def __init__(self):
|
||||||
|
"""Initialize the registry."""
|
||||||
|
self._processors: dict[str, WebhookProcessorProtocol] = {}
|
||||||
|
|
||||||
|
def register(self, processor: WebhookProcessorProtocol) -> None:
|
||||||
|
"""Register a webhook processor.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
processor: Processor instance to register
|
||||||
|
|
||||||
|
"""
|
||||||
|
self._processors[processor.webhook_type] = processor
|
||||||
|
_LOGGER.info("Registered webhook processor: %s", processor.webhook_type)
|
||||||
|
|
||||||
|
def get_processor(self, webhook_type: str) -> WebhookProcessorProtocol | None:
|
||||||
|
"""Get processor for webhook type.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
webhook_type: Type of webhook to process
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
Processor instance or None if not found
|
||||||
|
|
||||||
|
"""
|
||||||
|
return self._processors.get(webhook_type)
|
||||||
|
|
||||||
|
|
||||||
|
async def process_wix_form_submission(
|
||||||
|
data: dict[str, Any],
|
||||||
|
db_session: AsyncSession,
|
||||||
|
config: dict[str, Any] | None = None,
|
||||||
|
hotel_id: str | None = None,
|
||||||
|
event_dispatcher: Any | None = None,
|
||||||
|
):
|
||||||
|
"""Shared business logic for handling Wix form submissions (test and production).
|
||||||
|
|
||||||
|
Args:
|
||||||
|
data: Webhook payload data
|
||||||
|
db_session: Database session
|
||||||
|
config: Application config (optional)
|
||||||
|
hotel_id: Hotel ID (optional, will use from data or config default if not provided)
|
||||||
|
event_dispatcher: Event dispatcher for push notifications (optional)
|
||||||
|
"""
|
||||||
|
timestamp = datetime.now().isoformat()
|
||||||
|
|
||||||
|
_LOGGER.info("Received Wix form data at %s", timestamp)
|
||||||
|
|
||||||
|
# Provide fallback config if still None
|
||||||
|
if config is None:
|
||||||
|
config = {}
|
||||||
|
|
||||||
|
data = data.get("data") # Handle nested "data" key if present
|
||||||
|
|
||||||
|
# save customer and reservation to DB
|
||||||
|
|
||||||
|
contact_info = data.get("contact", {})
|
||||||
|
first_name = contact_info.get("name", {}).get("first")
|
||||||
|
last_name = contact_info.get("name", {}).get("last")
|
||||||
|
email = contact_info.get("email")
|
||||||
|
phone_number = contact_info.get("phones", [{}])[0].get("e164Phone")
|
||||||
|
contact_info.get("locale", "de-de")
|
||||||
|
contact_id = contact_info.get("contactId")
|
||||||
|
|
||||||
|
name_prefix = data.get("field:anrede")
|
||||||
|
|
||||||
|
email_newsletter = data.get("field:form_field_5a7b", False)
|
||||||
|
|
||||||
|
# if email_newsletter is a string, attempt to convert to boolean, else false
|
||||||
|
if isinstance(email_newsletter, str):
|
||||||
|
email_newsletter = email_newsletter.lower() in [
|
||||||
|
"yes",
|
||||||
|
"true",
|
||||||
|
"1",
|
||||||
|
"on",
|
||||||
|
"selezionato",
|
||||||
|
"angekreuzt",
|
||||||
|
]
|
||||||
|
|
||||||
|
address_line = None
|
||||||
|
city_name = None
|
||||||
|
postal_code = None
|
||||||
|
country_code = None
|
||||||
|
gender = None
|
||||||
|
birth_date = None
|
||||||
|
language = data.get("contact", {}).get("locale", "en")[:2]
|
||||||
|
|
||||||
|
# Dates
|
||||||
|
start_date = (
|
||||||
|
data.get("field:date_picker_a7c8")
|
||||||
|
or data.get("Anreisedatum")
|
||||||
|
or data.get("submissions", [{}])[1].get("value")
|
||||||
|
)
|
||||||
|
end_date = (
|
||||||
|
data.get("field:date_picker_7e65")
|
||||||
|
or data.get("Abreisedatum")
|
||||||
|
or data.get("submissions", [{}])[2].get("value")
|
||||||
|
)
|
||||||
|
|
||||||
|
# Room/guest info
|
||||||
|
num_adults = int(data.get("field:number_7cf5") or 1)
|
||||||
|
num_children = int(data.get("field:anzahl_kinder") or 0)
|
||||||
|
children_ages = []
|
||||||
|
if num_children > 0:
|
||||||
|
# Collect all child age fields, then take only the first num_children
|
||||||
|
# This handles form updates that may send extra padded/zero fields
|
||||||
|
temp_ages = []
|
||||||
|
for k in data:
|
||||||
|
if k.startswith("field:alter_kind_"):
|
||||||
|
if data[k] is None or data[k] == "":
|
||||||
|
continue
|
||||||
|
try:
|
||||||
|
age = int(data[k])
|
||||||
|
temp_ages.append(age)
|
||||||
|
except ValueError:
|
||||||
|
_LOGGER.warning("Invalid age value for %s: %s", k, data[k])
|
||||||
|
|
||||||
|
# Only keep the first num_children ages, regardless of their values
|
||||||
|
children_ages = temp_ages[:num_children]
|
||||||
|
|
||||||
|
offer = data.get("field:angebot_auswaehlen")
|
||||||
|
|
||||||
|
# get submissionId and ensure max length 35. Generate one if not present
|
||||||
|
|
||||||
|
unique_id = data.get("submissionId", generate_unique_id())
|
||||||
|
|
||||||
|
# Use CustomerService to handle customer creation/update with hashing
|
||||||
|
customer_service = CustomerService(db_session)
|
||||||
|
|
||||||
|
customer_data = {
|
||||||
|
"given_name": first_name,
|
||||||
|
"surname": last_name,
|
||||||
|
"contact_id": contact_id,
|
||||||
|
"name_prefix": name_prefix,
|
||||||
|
"email_address": email,
|
||||||
|
"phone": phone_number,
|
||||||
|
"email_newsletter": email_newsletter,
|
||||||
|
"address_line": address_line,
|
||||||
|
"city_name": city_name,
|
||||||
|
"postal_code": postal_code,
|
||||||
|
"country_code": country_code,
|
||||||
|
"gender": gender,
|
||||||
|
"birth_date": birth_date,
|
||||||
|
"language": language,
|
||||||
|
"address_catalog": False,
|
||||||
|
"name_title": None,
|
||||||
|
}
|
||||||
|
|
||||||
|
# This automatically creates/updates both Customer and HashedCustomer
|
||||||
|
db_customer = await customer_service.get_or_create_customer(customer_data)
|
||||||
|
|
||||||
|
# Determine hotel_code and hotel_name
|
||||||
|
# Priority: 1) Passed hotel_id, 2) Form field, 3) Config default, 4) Fallback
|
||||||
|
hotel_code = hotel_id or data.get("field:hotelid", None)
|
||||||
|
|
||||||
|
if hotel_code is None:
|
||||||
|
_LOGGER.warning("No hotel_code provided, using default from config")
|
||||||
|
hotel_code = config.get("default_hotel_code", "123")
|
||||||
|
|
||||||
|
hotel_name = (
|
||||||
|
data.get("field:hotelname")
|
||||||
|
or data.get("hotelname")
|
||||||
|
or config.get("default_hotel_name")
|
||||||
|
or "Frangart Inn" # fallback
|
||||||
|
)
|
||||||
|
|
||||||
|
submissionTime = data.get("submissionTime") # 2025-10-07T05:48:41.855Z
|
||||||
|
try:
|
||||||
|
if submissionTime:
|
||||||
|
submissionTime = datetime.fromisoformat(
|
||||||
|
submissionTime[:-1]
|
||||||
|
) # Remove Z and convert
|
||||||
|
except Exception as e:
|
||||||
|
_LOGGER.exception("Error parsing submissionTime: %s", e)
|
||||||
|
submissionTime = None
|
||||||
|
|
||||||
|
# Extract fbclid and gclid for conditional account ID lookup
|
||||||
|
fbclid = data.get("field:fbclid")
|
||||||
|
gclid = data.get("field:gclid")
|
||||||
|
|
||||||
|
# Get advertising account IDs conditionally based on fbclid/gclid presence
|
||||||
|
meta_account_id, google_account_id = get_advertising_account_ids(
|
||||||
|
config, hotel_code, fbclid, gclid
|
||||||
|
)
|
||||||
|
|
||||||
|
reservation = ReservationData(
|
||||||
|
unique_id=unique_id,
|
||||||
|
start_date=date.fromisoformat(start_date),
|
||||||
|
end_date=date.fromisoformat(end_date),
|
||||||
|
num_adults=num_adults,
|
||||||
|
num_children=num_children,
|
||||||
|
children_ages=children_ages,
|
||||||
|
hotel_code=hotel_code,
|
||||||
|
hotel_name=hotel_name,
|
||||||
|
offer=offer,
|
||||||
|
created_at=submissionTime,
|
||||||
|
utm_source=data.get("field:utm_source"),
|
||||||
|
utm_medium=data.get("field:utm_medium"),
|
||||||
|
utm_campaign=data.get("field:utm_campaign"),
|
||||||
|
utm_term=data.get("field:utm_term"),
|
||||||
|
utm_content=data.get("field:utm_content"),
|
||||||
|
user_comment=data.get("field:long_answer_3524", ""),
|
||||||
|
fbclid=fbclid,
|
||||||
|
gclid=gclid,
|
||||||
|
meta_account_id=meta_account_id,
|
||||||
|
google_account_id=google_account_id,
|
||||||
|
)
|
||||||
|
|
||||||
|
if reservation.md5_unique_id is None:
|
||||||
|
raise HTTPException(status_code=400, detail="Failed to generate md5_unique_id")
|
||||||
|
|
||||||
|
# Use ReservationService to create reservation
|
||||||
|
reservation_service = ReservationService(db_session)
|
||||||
|
try:
|
||||||
|
db_reservation = await reservation_service.create_reservation(
|
||||||
|
reservation, db_customer.id
|
||||||
|
)
|
||||||
|
except IntegrityError as e:
|
||||||
|
await db_session.rollback()
|
||||||
|
# Check if this is a duplicate (unique constraint violation)
|
||||||
|
error_msg = str(e.orig) if hasattr(e, 'orig') else str(e)
|
||||||
|
is_duplicate = any(keyword in error_msg.lower() for keyword in ['unique', 'duplicate', 'already exists'])
|
||||||
|
|
||||||
|
if is_duplicate:
|
||||||
|
_LOGGER.info(
|
||||||
|
"Duplicate reservation detected for unique_id=%s, skipping (this is expected for reprocessing)",
|
||||||
|
unique_id
|
||||||
|
)
|
||||||
|
return {
|
||||||
|
"status": "duplicate",
|
||||||
|
"message": "Reservation already exists (duplicate submission)",
|
||||||
|
"unique_id": unique_id,
|
||||||
|
"timestamp": timestamp,
|
||||||
|
}
|
||||||
|
else:
|
||||||
|
# Real integrity error (not a duplicate)
|
||||||
|
_LOGGER.exception("Database integrity error creating reservation: %s", e)
|
||||||
|
raise HTTPException(
|
||||||
|
status_code=500, detail="Database error creating reservation"
|
||||||
|
) from e
|
||||||
|
|
||||||
|
async def push_event():
|
||||||
|
# Fire event for listeners (push, etc.) - hotel-specific dispatch
|
||||||
|
if event_dispatcher:
|
||||||
|
# Get hotel_code from reservation to target the right listeners
|
||||||
|
hotel_code = getattr(db_reservation, "hotel_code", None)
|
||||||
|
if hotel_code and hotel_code.strip():
|
||||||
|
await event_dispatcher.dispatch_for_hotel(
|
||||||
|
"form_processed", hotel_code, db_customer, db_reservation
|
||||||
|
)
|
||||||
|
_LOGGER.info("Dispatched form_processed event for hotel %s", hotel_code)
|
||||||
|
else:
|
||||||
|
_LOGGER.warning(
|
||||||
|
"No hotel_code in reservation, skipping push notifications"
|
||||||
|
)
|
||||||
|
|
||||||
|
# Create task and store reference to prevent it from being garbage collected
|
||||||
|
# The task runs independently and we don't need to await it here
|
||||||
|
task = asyncio.create_task(push_event())
|
||||||
|
# Add done callback to log any exceptions that occur in the background task
|
||||||
|
task.add_done_callback(lambda t: t.exception() if not t.cancelled() else None)
|
||||||
|
|
||||||
|
return {
|
||||||
|
"status": "success",
|
||||||
|
"message": "Wix form data received successfully",
|
||||||
|
"received_keys": list(data.keys()),
|
||||||
|
"timestamp": timestamp,
|
||||||
|
"note": "No authentication required for this endpoint",
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
class WixFormProcessor:
|
||||||
|
"""Processor for Wix form webhooks."""
|
||||||
|
|
||||||
|
@property
|
||||||
|
def webhook_type(self) -> str:
|
||||||
|
"""Return webhook type identifier."""
|
||||||
|
return "wix_form"
|
||||||
|
|
||||||
|
async def process(
|
||||||
|
self,
|
||||||
|
webhook_request: WebhookRequest,
|
||||||
|
db_session: AsyncSession,
|
||||||
|
config: dict[str, Any] | None = None,
|
||||||
|
event_dispatcher: Any | None = None,
|
||||||
|
) -> dict[str, Any]:
|
||||||
|
"""Process Wix form webhook payload.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
webhook_request: WebhookRequest database record
|
||||||
|
db_session: Database session
|
||||||
|
config: Application configuration (optional)
|
||||||
|
event_dispatcher: Event dispatcher for push notifications (optional)
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
Response dict with status and details
|
||||||
|
|
||||||
|
"""
|
||||||
|
# Call processing function with data from webhook_request
|
||||||
|
result = await process_wix_form_submission(
|
||||||
|
data=webhook_request.payload_json,
|
||||||
|
db_session=db_session,
|
||||||
|
config=config,
|
||||||
|
hotel_id=webhook_request.hotel_id,
|
||||||
|
event_dispatcher=event_dispatcher,
|
||||||
|
)
|
||||||
|
|
||||||
|
return result
|
||||||
|
|
||||||
|
|
||||||
|
async def process_generic_webhook_submission(
|
||||||
|
data: dict[str, Any],
|
||||||
|
db_session: AsyncSession,
|
||||||
|
config: dict[str, Any] | None = None,
|
||||||
|
hotel_id: str | None = None,
|
||||||
|
event_dispatcher: Any | None = None,
|
||||||
|
):
|
||||||
|
"""Process generic webhook submissions with nested structure.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
data: Webhook payload data
|
||||||
|
db_session: Database session
|
||||||
|
config: Application config (optional)
|
||||||
|
hotel_id: Hotel ID (optional, will use from data or config default)
|
||||||
|
event_dispatcher: Event dispatcher for push notifications (optional)
|
||||||
|
|
||||||
|
Expected structure:
|
||||||
|
{
|
||||||
|
"hotel_data": {"hotelname": "...", "hotelcode": "..."},
|
||||||
|
"form_data": {
|
||||||
|
"sprache": "de/it/en",
|
||||||
|
"anreise": "DD.MM.YYYY",
|
||||||
|
"abreise": "DD.MM.YYYY",
|
||||||
|
"erwachsene": "N",
|
||||||
|
"kinder": "N",
|
||||||
|
"alter": {"1": "age1", "2": "age2", ...},
|
||||||
|
"anrede": "...",
|
||||||
|
"name": "...",
|
||||||
|
"nachname": "...",
|
||||||
|
"mail": "...",
|
||||||
|
"tel": "...",
|
||||||
|
"nachricht": "..."
|
||||||
|
},
|
||||||
|
"tracking_data": {
|
||||||
|
"utm_source": "...",
|
||||||
|
"utm_medium": "...",
|
||||||
|
"utm_campaign": "...",
|
||||||
|
"utm_content": "...",
|
||||||
|
"utm_term": "...",
|
||||||
|
"fbclid": "...",
|
||||||
|
"gclid": "..."
|
||||||
|
},
|
||||||
|
"timestamp": "ISO8601"
|
||||||
|
}
|
||||||
|
"""
|
||||||
|
timestamp = datetime.now().isoformat()
|
||||||
|
_LOGGER.info("Processing generic webhook submission at %s", timestamp)
|
||||||
|
|
||||||
|
# Provide fallback config if still None
|
||||||
|
if config is None:
|
||||||
|
config = {}
|
||||||
|
|
||||||
|
# Extract nested data
|
||||||
|
hotel_data = data.get("hotel_data", {})
|
||||||
|
form_data = data.get("form_data", {})
|
||||||
|
tracking_data = data.get("tracking_data", {})
|
||||||
|
offer_data = form_data.get("unterkunftTyp", {})
|
||||||
|
|
||||||
|
selected_offers = []
|
||||||
|
|
||||||
|
if offer_data:
|
||||||
|
# grab keys and values. If value is "on" add the key not the value to a list of selected offers
|
||||||
|
|
||||||
|
offer_data: dict[str, str]
|
||||||
|
|
||||||
|
for key, value in offer_data.items():
|
||||||
|
if value == "on":
|
||||||
|
selected_offers.append(key)
|
||||||
|
|
||||||
|
selected_offers_str = ", ".join(selected_offers) if selected_offers else None
|
||||||
|
|
||||||
|
# Extract hotel information
|
||||||
|
# Priority: 1) Passed hotel_id, 2) Webhook data, 3) Config default, 4) Fallback
|
||||||
|
hotel_code = hotel_id or hotel_data.get("hotelcode")
|
||||||
|
hotel_name = hotel_data.get("hotelname")
|
||||||
|
|
||||||
|
if not hotel_code:
|
||||||
|
_LOGGER.warning("No hotel_code provided, using default from config")
|
||||||
|
hotel_code = config.get("default_hotel_code", "123")
|
||||||
|
|
||||||
|
if not hotel_name:
|
||||||
|
hotel_name = config.get("default_hotel_name") or "Frangart Inn"
|
||||||
|
|
||||||
|
# Extract customer information
|
||||||
|
first_name = form_data.get("name")
|
||||||
|
last_name = form_data.get("nachname")
|
||||||
|
email = form_data.get("mail")
|
||||||
|
phone_number = form_data.get("tel")
|
||||||
|
name_prefix = form_data.get("anrede")
|
||||||
|
language = form_data.get("sprache", "de")[:2]
|
||||||
|
user_comment = form_data.get("nachricht", "")
|
||||||
|
plz = form_data.get("plz", "")
|
||||||
|
city = form_data.get("stadt", "")
|
||||||
|
country = form_data.get("land", "")
|
||||||
|
|
||||||
|
# Parse dates - handle DD.MM.YYYY format
|
||||||
|
start_date_str = form_data.get("anreise")
|
||||||
|
end_date_str = form_data.get("abreise")
|
||||||
|
|
||||||
|
if not start_date_str or not end_date_str:
|
||||||
|
raise HTTPException(
|
||||||
|
status_code=400, detail="Missing required dates (anreise/abreise)"
|
||||||
|
)
|
||||||
|
|
||||||
|
try:
|
||||||
|
# Parse DD.MM.YYYY format using strptime
|
||||||
|
start_date = datetime.strptime(start_date_str, "%d.%m.%Y").date()
|
||||||
|
end_date = datetime.strptime(end_date_str, "%d.%m.%Y").date()
|
||||||
|
except ValueError as e:
|
||||||
|
_LOGGER.error(
|
||||||
|
"Error parsing dates: start=%s, end=%s, error=%s",
|
||||||
|
start_date_str,
|
||||||
|
end_date_str,
|
||||||
|
e,
|
||||||
|
)
|
||||||
|
raise HTTPException(status_code=400, detail=f"Invalid date format: {e}") from e
|
||||||
|
|
||||||
|
# Extract room/guest info
|
||||||
|
num_adults = int(form_data.get("erwachsene", 2))
|
||||||
|
num_children = int(form_data.get("kinder", 0))
|
||||||
|
|
||||||
|
# Extract children ages from nested structure
|
||||||
|
children_ages = []
|
||||||
|
if num_children > 0:
|
||||||
|
alter_data = form_data.get("alter", {})
|
||||||
|
for i in range(1, num_children + 1):
|
||||||
|
age_str = alter_data.get(str(i))
|
||||||
|
if age_str:
|
||||||
|
try:
|
||||||
|
children_ages.append(int(age_str))
|
||||||
|
except ValueError:
|
||||||
|
_LOGGER.warning("Invalid age value for child %d: %s", i, age_str)
|
||||||
|
|
||||||
|
# Extract tracking information
|
||||||
|
utm_source = None
|
||||||
|
utm_medium = None
|
||||||
|
utm_campaign = None
|
||||||
|
utm_term = None
|
||||||
|
utm_content = None
|
||||||
|
fbclid = None
|
||||||
|
gclid = None
|
||||||
|
|
||||||
|
if tracking_data:
|
||||||
|
utm_source = tracking_data.get("utm_source")
|
||||||
|
utm_medium = tracking_data.get("utm_medium")
|
||||||
|
utm_campaign = tracking_data.get("utm_campaign")
|
||||||
|
utm_term = tracking_data.get("utm_term")
|
||||||
|
utm_content = tracking_data.get("utm_content")
|
||||||
|
fbclid = tracking_data.get("fbclid")
|
||||||
|
gclid = tracking_data.get("gclid")
|
||||||
|
|
||||||
|
# Parse submission timestamp
|
||||||
|
submission_time = data.get("timestamp")
|
||||||
|
try:
|
||||||
|
if submission_time:
|
||||||
|
# Handle ISO8601 format with timezone
|
||||||
|
if submission_time.endswith("Z"):
|
||||||
|
submission_time = datetime.fromisoformat(submission_time[:-1])
|
||||||
|
elif "+" in submission_time:
|
||||||
|
# Remove timezone info (e.g., +02:00)
|
||||||
|
submission_time = datetime.fromisoformat(submission_time.split("+")[0])
|
||||||
|
else:
|
||||||
|
submission_time = datetime.fromisoformat(submission_time)
|
||||||
|
except Exception as e:
|
||||||
|
_LOGGER.exception("Error parsing submission timestamp: %s", e)
|
||||||
|
submission_time = None
|
||||||
|
|
||||||
|
# Generate unique ID
|
||||||
|
unique_id = generate_unique_id()
|
||||||
|
|
||||||
|
# Use CustomerService to handle customer creation/update with hashing
|
||||||
|
customer_service = CustomerService(db_session)
|
||||||
|
|
||||||
|
customer_data = {
|
||||||
|
"given_name": first_name,
|
||||||
|
"surname": last_name,
|
||||||
|
"contact_id": None,
|
||||||
|
"name_prefix": name_prefix if name_prefix != "--" else None,
|
||||||
|
"email_address": email,
|
||||||
|
"phone": phone_number if phone_number else None,
|
||||||
|
"email_newsletter": False,
|
||||||
|
"address_line": None,
|
||||||
|
"city_name": city if city else None,
|
||||||
|
"postal_code": plz if plz else None,
|
||||||
|
"country_code": country if country else None,
|
||||||
|
"gender": None,
|
||||||
|
"birth_date": None,
|
||||||
|
"language": language,
|
||||||
|
"address_catalog": False,
|
||||||
|
"name_title": None,
|
||||||
|
}
|
||||||
|
|
||||||
|
# Create/update customer
|
||||||
|
db_customer = await customer_service.get_or_create_customer(customer_data)
|
||||||
|
|
||||||
|
# Get advertising account IDs conditionally based on fbclid/gclid presence
|
||||||
|
meta_account_id, google_account_id = get_advertising_account_ids(
|
||||||
|
config, hotel_code, fbclid, gclid
|
||||||
|
)
|
||||||
|
|
||||||
|
# Create reservation
|
||||||
|
reservation_kwargs = {
|
||||||
|
"unique_id": unique_id,
|
||||||
|
"start_date": start_date,
|
||||||
|
"end_date": end_date,
|
||||||
|
"num_adults": num_adults,
|
||||||
|
"num_children": num_children,
|
||||||
|
"children_ages": children_ages,
|
||||||
|
"hotel_code": hotel_code,
|
||||||
|
"hotel_name": hotel_name,
|
||||||
|
"offer": selected_offers_str,
|
||||||
|
"utm_source": utm_source,
|
||||||
|
"utm_medium": utm_medium,
|
||||||
|
"utm_campaign": utm_campaign,
|
||||||
|
"utm_term": utm_term,
|
||||||
|
"utm_content": utm_content,
|
||||||
|
"user_comment": user_comment,
|
||||||
|
"fbclid": fbclid,
|
||||||
|
"gclid": gclid,
|
||||||
|
"meta_account_id": meta_account_id,
|
||||||
|
"google_account_id": google_account_id,
|
||||||
|
}
|
||||||
|
|
||||||
|
# Only include created_at if we have a valid submission_time
|
||||||
|
if submission_time:
|
||||||
|
reservation_kwargs["created_at"] = submission_time
|
||||||
|
|
||||||
|
reservation = ReservationData(**reservation_kwargs)
|
||||||
|
|
||||||
|
if reservation.md5_unique_id is None:
|
||||||
|
raise HTTPException(status_code=400, detail="Failed to generate md5_unique_id")
|
||||||
|
|
||||||
|
# Use ReservationService to create reservation
|
||||||
|
reservation_service = ReservationService(db_session)
|
||||||
|
try:
|
||||||
|
db_reservation = await reservation_service.create_reservation(
|
||||||
|
reservation, db_customer.id
|
||||||
|
)
|
||||||
|
except IntegrityError as e:
|
||||||
|
await db_session.rollback()
|
||||||
|
# Check if this is a duplicate (unique constraint violation)
|
||||||
|
error_msg = str(e.orig) if hasattr(e, 'orig') else str(e)
|
||||||
|
is_duplicate = any(keyword in error_msg.lower() for keyword in ['unique', 'duplicate', 'already exists'])
|
||||||
|
|
||||||
|
if is_duplicate:
|
||||||
|
_LOGGER.info(
|
||||||
|
"Duplicate reservation detected for unique_id=%s, skipping (this is expected for reprocessing)",
|
||||||
|
unique_id
|
||||||
|
)
|
||||||
|
return {
|
||||||
|
"status": "duplicate",
|
||||||
|
"message": "Reservation already exists (duplicate submission)",
|
||||||
|
"unique_id": unique_id,
|
||||||
|
"timestamp": timestamp,
|
||||||
|
}
|
||||||
|
else:
|
||||||
|
# Real integrity error (not a duplicate)
|
||||||
|
_LOGGER.exception("Database integrity error creating reservation: %s", e)
|
||||||
|
raise HTTPException(
|
||||||
|
status_code=500, detail="Database error creating reservation"
|
||||||
|
) from e
|
||||||
|
|
||||||
|
async def push_event():
|
||||||
|
# Fire event for listeners (push, etc.) - hotel-specific dispatch
|
||||||
|
if event_dispatcher:
|
||||||
|
# Get hotel_code from reservation to target the right listeners
|
||||||
|
hotel_code = getattr(db_reservation, "hotel_code", None)
|
||||||
|
if hotel_code and hotel_code.strip():
|
||||||
|
await event_dispatcher.dispatch_for_hotel(
|
||||||
|
"form_processed", hotel_code, db_customer, db_reservation
|
||||||
|
)
|
||||||
|
_LOGGER.info("Dispatched form_processed event for hotel %s", hotel_code)
|
||||||
|
else:
|
||||||
|
_LOGGER.warning(
|
||||||
|
"No hotel_code in reservation, skipping push notifications"
|
||||||
|
)
|
||||||
|
|
||||||
|
# Create task and store reference to prevent garbage collection
|
||||||
|
task = asyncio.create_task(push_event())
|
||||||
|
# Add done callback to log any exceptions
|
||||||
|
task.add_done_callback(lambda t: t.exception() if not t.cancelled() else None)
|
||||||
|
|
||||||
|
_LOGGER.info(
|
||||||
|
"Successfully processed generic webhook: customer_id=%s, reservation_id=%s",
|
||||||
|
db_customer.id,
|
||||||
|
db_reservation.id,
|
||||||
|
)
|
||||||
|
|
||||||
|
return {
|
||||||
|
"status": "success",
|
||||||
|
"message": "Generic webhook data processed successfully",
|
||||||
|
"customer_id": db_customer.id,
|
||||||
|
"reservation_id": db_reservation.id,
|
||||||
|
"timestamp": timestamp,
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
class GenericWebhookProcessor:
|
||||||
|
"""Processor for generic webhooks."""
|
||||||
|
|
||||||
|
@property
|
||||||
|
def webhook_type(self) -> str:
|
||||||
|
"""Return webhook type identifier."""
|
||||||
|
return "generic"
|
||||||
|
|
||||||
|
async def process(
|
||||||
|
self,
|
||||||
|
webhook_request: WebhookRequest,
|
||||||
|
db_session: AsyncSession,
|
||||||
|
config: dict[str, Any] | None = None,
|
||||||
|
event_dispatcher: Any | None = None,
|
||||||
|
) -> dict[str, Any]:
|
||||||
|
"""Process generic webhook payload.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
webhook_request: WebhookRequest database record
|
||||||
|
db_session: Database session
|
||||||
|
config: Application configuration (optional)
|
||||||
|
event_dispatcher: Event dispatcher for push notifications (optional)
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
Response dict with status and details
|
||||||
|
|
||||||
|
"""
|
||||||
|
# Call processing function with data from webhook_request
|
||||||
|
result = await process_generic_webhook_submission(
|
||||||
|
data=webhook_request.payload_json,
|
||||||
|
db_session=db_session,
|
||||||
|
config=config,
|
||||||
|
hotel_id=webhook_request.hotel_id,
|
||||||
|
event_dispatcher=event_dispatcher,
|
||||||
|
)
|
||||||
|
|
||||||
|
return result
|
||||||
|
|
||||||
|
|
||||||
|
# Global registry instance
|
||||||
|
webhook_registry = WebhookProcessorRegistry()
|
||||||
|
|
||||||
|
|
||||||
|
def initialize_webhook_processors() -> None:
|
||||||
|
"""Initialize and register all webhook processors.
|
||||||
|
|
||||||
|
This should be called during application startup.
|
||||||
|
"""
|
||||||
|
# Register built-in processors
|
||||||
|
webhook_registry.register(WixFormProcessor())
|
||||||
|
webhook_registry.register(GenericWebhookProcessor())
|
||||||
|
|
||||||
|
_LOGGER.info("Webhook processors initialized")
|
||||||
197
tests/helpers/README.md
Normal file
197
tests/helpers/README.md
Normal file
@@ -0,0 +1,197 @@
|
|||||||
|
# Test Helpers
|
||||||
|
|
||||||
|
This directory contains helper utilities for creating test data.
|
||||||
|
|
||||||
|
## XML Builders
|
||||||
|
|
||||||
|
The `xml_builders` module provides convenient builder classes for creating reservation XML structures used in conversion service tests.
|
||||||
|
|
||||||
|
### Quick Start
|
||||||
|
|
||||||
|
```python
|
||||||
|
from tests.helpers import ReservationXMLBuilder
|
||||||
|
|
||||||
|
# Create a simple reservation
|
||||||
|
xml = (
|
||||||
|
ReservationXMLBuilder(
|
||||||
|
hotel_id="39054_001",
|
||||||
|
reservation_id="12345",
|
||||||
|
reservation_number="RES-001",
|
||||||
|
reservation_date="2025-11-14",
|
||||||
|
)
|
||||||
|
.set_guest(
|
||||||
|
guest_id="guest_001",
|
||||||
|
first_name="John",
|
||||||
|
last_name="Doe",
|
||||||
|
email="john@example.com",
|
||||||
|
)
|
||||||
|
.add_room(
|
||||||
|
arrival="2025-12-01",
|
||||||
|
departure="2025-12-05",
|
||||||
|
revenue_logis_per_day=150.0, # Fixed revenue per night
|
||||||
|
)
|
||||||
|
.build_xml()
|
||||||
|
)
|
||||||
|
```
|
||||||
|
|
||||||
|
### Features
|
||||||
|
|
||||||
|
#### ReservationXMLBuilder
|
||||||
|
|
||||||
|
The main builder class for creating reservation XML structures.
|
||||||
|
|
||||||
|
**Key Features:**
|
||||||
|
- Fluent API for method chaining
|
||||||
|
- Automatic daily sales generation from arrival to departure
|
||||||
|
- Convenient revenue-per-day specification (no need to manually create each dailySale)
|
||||||
|
- Support for advertising campaign data
|
||||||
|
- Guest information with optional fields
|
||||||
|
|
||||||
|
**Example - Multi-room reservation:**
|
||||||
|
|
||||||
|
```python
|
||||||
|
xml = (
|
||||||
|
ReservationXMLBuilder(
|
||||||
|
hotel_id="39054_001",
|
||||||
|
reservation_id="12345",
|
||||||
|
reservation_number="RES-001",
|
||||||
|
reservation_date="2025-11-14",
|
||||||
|
)
|
||||||
|
.set_guest(
|
||||||
|
guest_id="guest_001",
|
||||||
|
first_name="Jane",
|
||||||
|
last_name="Smith",
|
||||||
|
email="jane@example.com",
|
||||||
|
country_code="US",
|
||||||
|
)
|
||||||
|
.add_room(
|
||||||
|
arrival="2025-12-01",
|
||||||
|
departure="2025-12-05",
|
||||||
|
room_number="101",
|
||||||
|
room_type="DZV",
|
||||||
|
revenue_logis_per_day=150.0,
|
||||||
|
)
|
||||||
|
.add_room(
|
||||||
|
arrival="2025-12-01",
|
||||||
|
departure="2025-12-05",
|
||||||
|
room_number="102",
|
||||||
|
room_type="DZM",
|
||||||
|
revenue_logis_per_day=200.0,
|
||||||
|
)
|
||||||
|
.build_xml()
|
||||||
|
)
|
||||||
|
```
|
||||||
|
|
||||||
|
#### Daily Sales Generation
|
||||||
|
|
||||||
|
The builder automatically generates `<dailySale>` entries for each day from arrival to departure (inclusive).
|
||||||
|
|
||||||
|
- **Days before departure**: Include `revenueTotal` and `revenueLogis` attributes
|
||||||
|
- **Departure day**: No revenue attributes (just the date)
|
||||||
|
|
||||||
|
**Example:**
|
||||||
|
```python
|
||||||
|
# A 3-night stay (Dec 1-4)
|
||||||
|
.add_room(
|
||||||
|
arrival="2025-12-01",
|
||||||
|
departure="2025-12-04",
|
||||||
|
revenue_logis_per_day=160.0,
|
||||||
|
)
|
||||||
|
```
|
||||||
|
|
||||||
|
Generates:
|
||||||
|
```xml
|
||||||
|
<dailySales>
|
||||||
|
<dailySale date="2025-12-01" revenueTotal="160.0" revenueLogis="160.0"/>
|
||||||
|
<dailySale date="2025-12-02" revenueTotal="160.0" revenueLogis="160.0"/>
|
||||||
|
<dailySale date="2025-12-03" revenueTotal="160.0" revenueLogis="160.0"/>
|
||||||
|
<dailySale date="2025-12-04"/> <!-- No revenue on departure day -->
|
||||||
|
</dailySales>
|
||||||
|
```
|
||||||
|
|
||||||
|
#### MultiReservationXMLBuilder
|
||||||
|
|
||||||
|
For creating XML documents with multiple reservations:
|
||||||
|
|
||||||
|
```python
|
||||||
|
from tests.helpers import ReservationXMLBuilder, MultiReservationXMLBuilder
|
||||||
|
|
||||||
|
multi_builder = MultiReservationXMLBuilder()
|
||||||
|
|
||||||
|
# Add first reservation
|
||||||
|
res1 = (
|
||||||
|
ReservationXMLBuilder(...)
|
||||||
|
.set_guest(...)
|
||||||
|
.add_room(...)
|
||||||
|
)
|
||||||
|
multi_builder.add_reservation(res1)
|
||||||
|
|
||||||
|
# Add second reservation
|
||||||
|
res2 = (
|
||||||
|
ReservationXMLBuilder(...)
|
||||||
|
.set_guest(...)
|
||||||
|
.add_room(...)
|
||||||
|
)
|
||||||
|
multi_builder.add_reservation(res2)
|
||||||
|
|
||||||
|
xml = multi_builder.build_xml()
|
||||||
|
```
|
||||||
|
|
||||||
|
#### RoomReservationBuilder
|
||||||
|
|
||||||
|
Low-level builder for creating individual room reservations. Usually you'll use `ReservationXMLBuilder.add_room()` instead, but this is available for advanced use cases.
|
||||||
|
|
||||||
|
```python
|
||||||
|
from tests.helpers import RoomReservationBuilder
|
||||||
|
|
||||||
|
room_builder = RoomReservationBuilder(
|
||||||
|
arrival="2025-12-01",
|
||||||
|
departure="2025-12-05",
|
||||||
|
room_type="DZV",
|
||||||
|
room_number="101",
|
||||||
|
revenue_logis_per_day=150.0,
|
||||||
|
)
|
||||||
|
|
||||||
|
# Get the XML element (not a string)
|
||||||
|
room_elem = room_builder.build()
|
||||||
|
```
|
||||||
|
|
||||||
|
### Common Parameters
|
||||||
|
|
||||||
|
**ReservationXMLBuilder:**
|
||||||
|
- `hotel_id` - Hotel ID (required)
|
||||||
|
- `reservation_id` - Reservation ID (required)
|
||||||
|
- `reservation_number` - Reservation number (required)
|
||||||
|
- `reservation_date` - Reservation date YYYY-MM-DD (required)
|
||||||
|
- `creation_time` - Creation timestamp (optional, defaults to reservation_date + T00:00:00)
|
||||||
|
- `advertising_medium` - Advertising medium (optional)
|
||||||
|
- `advertising_partner` - Advertising partner (optional)
|
||||||
|
- `advertising_campagne` - Advertising campaign (optional)
|
||||||
|
|
||||||
|
**set_guest() parameters:**
|
||||||
|
- `guest_id` - Guest ID (required)
|
||||||
|
- `first_name` - First name (required)
|
||||||
|
- `last_name` - Last name (required)
|
||||||
|
- `email` - Email address (required)
|
||||||
|
- `language` - Language code (default: "en")
|
||||||
|
- `gender` - Gender (optional)
|
||||||
|
- `country_code` - Country code (optional)
|
||||||
|
- `country` - Country name (optional)
|
||||||
|
|
||||||
|
**add_room() parameters:**
|
||||||
|
- `arrival` - Arrival date YYYY-MM-DD (required)
|
||||||
|
- `departure` - Departure date YYYY-MM-DD (required)
|
||||||
|
- `room_type` - Room type code (default: "DZV")
|
||||||
|
- `room_number` - Room number (default: "101")
|
||||||
|
- `status` - Reservation status (default: "reserved")
|
||||||
|
- `adults` - Number of adults (default: 2)
|
||||||
|
- `children` - Number of children (default: 0)
|
||||||
|
- `infants` - Number of infants (default: 0)
|
||||||
|
- `rate_plan_code` - Rate plan code (default: "STANDARD")
|
||||||
|
- `revenue_logis_per_day` - Fixed revenue per night (optional, generates daily sales)
|
||||||
|
- `revenue_total_per_day` - Total revenue per night (optional, defaults to revenue_logis_per_day)
|
||||||
|
|
||||||
|
### See Also
|
||||||
|
|
||||||
|
- [tests/test_xml_builders.py](../test_xml_builders.py) - Unit tests demonstrating all features
|
||||||
|
- [tests/test_conversion_service.py](../test_conversion_service.py) - Integration examples (TestXMLBuilderUsage class)
|
||||||
13
tests/helpers/__init__.py
Normal file
13
tests/helpers/__init__.py
Normal file
@@ -0,0 +1,13 @@
|
|||||||
|
"""Test helper utilities for creating test data."""
|
||||||
|
|
||||||
|
from .xml_builders import (
|
||||||
|
ReservationXMLBuilder,
|
||||||
|
MultiReservationXMLBuilder,
|
||||||
|
RoomReservationBuilder,
|
||||||
|
)
|
||||||
|
|
||||||
|
__all__ = [
|
||||||
|
"ReservationXMLBuilder",
|
||||||
|
"MultiReservationXMLBuilder",
|
||||||
|
"RoomReservationBuilder",
|
||||||
|
]
|
||||||
392
tests/helpers/xml_builders.py
Normal file
392
tests/helpers/xml_builders.py
Normal file
@@ -0,0 +1,392 @@
|
|||||||
|
"""XML builder helpers for creating test reservation data.
|
||||||
|
|
||||||
|
This module provides convenient builder classes for generating reservation XML
|
||||||
|
structures used in conversion service tests.
|
||||||
|
"""
|
||||||
|
|
||||||
|
from datetime import datetime, timedelta
|
||||||
|
from typing import Optional
|
||||||
|
from xml.etree import ElementTree as ET
|
||||||
|
|
||||||
|
|
||||||
|
class RoomReservationBuilder:
|
||||||
|
"""Builder for creating roomReservation XML elements with daily sales."""
|
||||||
|
|
||||||
|
def __init__(
|
||||||
|
self,
|
||||||
|
arrival: str,
|
||||||
|
departure: str,
|
||||||
|
room_type: str = "DZV",
|
||||||
|
room_number: str = "101",
|
||||||
|
status: str = "reserved",
|
||||||
|
adults: int = 2,
|
||||||
|
children: int = 0,
|
||||||
|
infants: int = 0,
|
||||||
|
rate_plan_code: str = "STANDARD",
|
||||||
|
connected_room_type: str = "0",
|
||||||
|
revenue_logis_per_day: Optional[float] = None,
|
||||||
|
revenue_total_per_day: Optional[float] = None,
|
||||||
|
):
|
||||||
|
"""Initialize room reservation builder.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
arrival: Arrival date in YYYY-MM-DD format
|
||||||
|
departure: Departure date in YYYY-MM-DD format
|
||||||
|
room_type: Room type code
|
||||||
|
room_number: Room number
|
||||||
|
status: Reservation status (reserved, request, confirmed, etc.)
|
||||||
|
adults: Number of adults
|
||||||
|
children: Number of children
|
||||||
|
infants: Number of infants
|
||||||
|
rate_plan_code: Rate plan code
|
||||||
|
connected_room_type: Connected room type code
|
||||||
|
revenue_logis_per_day: Revenue per day (if None, no revenue attributes)
|
||||||
|
revenue_total_per_day: Total revenue per day (defaults to revenue_logis_per_day)
|
||||||
|
"""
|
||||||
|
self.arrival = arrival
|
||||||
|
self.departure = departure
|
||||||
|
self.room_type = room_type
|
||||||
|
self.room_number = room_number
|
||||||
|
self.status = status
|
||||||
|
self.adults = adults
|
||||||
|
self.children = children
|
||||||
|
self.infants = infants
|
||||||
|
self.rate_plan_code = rate_plan_code
|
||||||
|
self.connected_room_type = connected_room_type
|
||||||
|
self.revenue_logis_per_day = revenue_logis_per_day
|
||||||
|
self.revenue_total_per_day = revenue_total_per_day or revenue_logis_per_day
|
||||||
|
|
||||||
|
def build(self) -> ET.Element:
|
||||||
|
"""Build the roomReservation XML element with daily sales.
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
XML Element for the room reservation
|
||||||
|
"""
|
||||||
|
room_attrs = {
|
||||||
|
"arrival": self.arrival,
|
||||||
|
"departure": self.departure,
|
||||||
|
"status": self.status,
|
||||||
|
"roomType": self.room_type,
|
||||||
|
"roomNumber": self.room_number,
|
||||||
|
"adults": str(self.adults),
|
||||||
|
"ratePlanCode": self.rate_plan_code,
|
||||||
|
"connectedRoomType": self.connected_room_type,
|
||||||
|
}
|
||||||
|
|
||||||
|
if self.children > 0:
|
||||||
|
room_attrs["children"] = str(self.children)
|
||||||
|
if self.infants > 0:
|
||||||
|
room_attrs["infants"] = str(self.infants)
|
||||||
|
|
||||||
|
room_elem = ET.Element("roomReservation", room_attrs)
|
||||||
|
|
||||||
|
# Create dailySales element
|
||||||
|
daily_sales_elem = ET.SubElement(room_elem, "dailySales")
|
||||||
|
|
||||||
|
# Generate daily sale entries from arrival to departure (inclusive of departure for the no-revenue entry)
|
||||||
|
arrival_date = datetime.strptime(self.arrival, "%Y-%m-%d")
|
||||||
|
departure_date = datetime.strptime(self.departure, "%Y-%m-%d")
|
||||||
|
|
||||||
|
current_date = arrival_date
|
||||||
|
while current_date <= departure_date:
|
||||||
|
date_str = current_date.strftime("%Y-%m-%d")
|
||||||
|
daily_sale_attrs = {"date": date_str}
|
||||||
|
|
||||||
|
# Add revenue attributes for all days except departure day
|
||||||
|
if current_date < departure_date and self.revenue_logis_per_day is not None:
|
||||||
|
daily_sale_attrs["revenueTotal"] = str(self.revenue_total_per_day)
|
||||||
|
daily_sale_attrs["revenueLogis"] = str(self.revenue_logis_per_day)
|
||||||
|
|
||||||
|
ET.SubElement(daily_sales_elem, "dailySale", daily_sale_attrs)
|
||||||
|
current_date += timedelta(days=1)
|
||||||
|
|
||||||
|
return room_elem
|
||||||
|
|
||||||
|
|
||||||
|
class ReservationXMLBuilder:
|
||||||
|
"""Builder for creating complete reservation XML structures for testing.
|
||||||
|
|
||||||
|
This builder provides a fluent interface for constructing reservation XML
|
||||||
|
that matches the format expected by the ConversionService.
|
||||||
|
|
||||||
|
Example usage:
|
||||||
|
builder = ReservationXMLBuilder(
|
||||||
|
hotel_id="39054_001",
|
||||||
|
reservation_id="12345",
|
||||||
|
reservation_number="RES-001",
|
||||||
|
reservation_date="2025-11-14"
|
||||||
|
)
|
||||||
|
builder.set_guest(
|
||||||
|
guest_id="guest_001",
|
||||||
|
first_name="John",
|
||||||
|
last_name="Doe",
|
||||||
|
email="john@example.com"
|
||||||
|
)
|
||||||
|
builder.add_room(
|
||||||
|
arrival="2025-12-01",
|
||||||
|
departure="2025-12-05",
|
||||||
|
revenue_logis_per_day=150.0
|
||||||
|
)
|
||||||
|
xml_string = builder.build_xml()
|
||||||
|
"""
|
||||||
|
|
||||||
|
def __init__(
|
||||||
|
self,
|
||||||
|
hotel_id: str,
|
||||||
|
reservation_id: str,
|
||||||
|
reservation_number: str,
|
||||||
|
reservation_date: str,
|
||||||
|
creation_time: Optional[str] = None,
|
||||||
|
reservation_type: str = "reservation",
|
||||||
|
advertising_medium: Optional[str] = None,
|
||||||
|
advertising_partner: Optional[str] = None,
|
||||||
|
advertising_campagne: Optional[str] = None,
|
||||||
|
):
|
||||||
|
"""Initialize reservation builder.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
hotel_id: Hotel ID
|
||||||
|
reservation_id: Reservation ID
|
||||||
|
reservation_number: Reservation number
|
||||||
|
reservation_date: Reservation date in YYYY-MM-DD format
|
||||||
|
creation_time: Creation timestamp (defaults to reservation_date + T00:00:00)
|
||||||
|
reservation_type: Type of reservation (reservation, request, etc.)
|
||||||
|
advertising_medium: Advertising medium
|
||||||
|
advertising_partner: Advertising partner
|
||||||
|
advertising_campagne: Advertising campaign
|
||||||
|
"""
|
||||||
|
self.hotel_id = hotel_id
|
||||||
|
self.reservation_id = reservation_id
|
||||||
|
self.reservation_number = reservation_number
|
||||||
|
self.reservation_date = reservation_date
|
||||||
|
self.creation_time = creation_time or f"{reservation_date}T00:00:00"
|
||||||
|
self.reservation_type = reservation_type
|
||||||
|
self.advertising_medium = advertising_medium
|
||||||
|
self.advertising_partner = advertising_partner
|
||||||
|
self.advertising_campagne = advertising_campagne
|
||||||
|
|
||||||
|
self.guest_data: Optional[dict] = None
|
||||||
|
self.rooms: list[RoomReservationBuilder] = []
|
||||||
|
|
||||||
|
def set_guest(
|
||||||
|
self,
|
||||||
|
guest_id: str,
|
||||||
|
first_name: str,
|
||||||
|
last_name: str,
|
||||||
|
email: str,
|
||||||
|
language: str = "en",
|
||||||
|
gender: Optional[str] = None,
|
||||||
|
country_code: Optional[str] = None,
|
||||||
|
country: Optional[str] = None,
|
||||||
|
) -> "ReservationXMLBuilder":
|
||||||
|
"""Set guest information for the reservation.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
guest_id: Guest ID
|
||||||
|
first_name: Guest first name
|
||||||
|
last_name: Guest last name
|
||||||
|
email: Guest email
|
||||||
|
language: Guest language code
|
||||||
|
gender: Guest gender
|
||||||
|
country_code: Guest country code
|
||||||
|
country: Guest country name
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
Self for method chaining
|
||||||
|
"""
|
||||||
|
self.guest_data = {
|
||||||
|
"id": guest_id,
|
||||||
|
"firstName": first_name,
|
||||||
|
"lastName": last_name,
|
||||||
|
"email": email,
|
||||||
|
"language": language,
|
||||||
|
}
|
||||||
|
if gender:
|
||||||
|
self.guest_data["gender"] = gender
|
||||||
|
if country_code:
|
||||||
|
self.guest_data["countryCode"] = country_code
|
||||||
|
if country:
|
||||||
|
self.guest_data["country"] = country
|
||||||
|
|
||||||
|
return self
|
||||||
|
|
||||||
|
def add_room(
|
||||||
|
self,
|
||||||
|
arrival: str,
|
||||||
|
departure: str,
|
||||||
|
room_type: str = "DZV",
|
||||||
|
room_number: str = "101",
|
||||||
|
status: str = "reserved",
|
||||||
|
adults: int = 2,
|
||||||
|
children: int = 0,
|
||||||
|
infants: int = 0,
|
||||||
|
rate_plan_code: str = "STANDARD",
|
||||||
|
connected_room_type: str = "0",
|
||||||
|
revenue_logis_per_day: Optional[float] = None,
|
||||||
|
revenue_total_per_day: Optional[float] = None,
|
||||||
|
) -> "ReservationXMLBuilder":
|
||||||
|
"""Add a room reservation with convenient daily sales generation.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
arrival: Arrival date in YYYY-MM-DD format
|
||||||
|
departure: Departure date in YYYY-MM-DD format
|
||||||
|
room_type: Room type code
|
||||||
|
room_number: Room number
|
||||||
|
status: Reservation status
|
||||||
|
adults: Number of adults
|
||||||
|
children: Number of children
|
||||||
|
infants: Number of infants
|
||||||
|
rate_plan_code: Rate plan code
|
||||||
|
connected_room_type: Connected room type
|
||||||
|
revenue_logis_per_day: Fixed revenue per day (auto-generates dailySale entries)
|
||||||
|
revenue_total_per_day: Total revenue per day (defaults to revenue_logis_per_day)
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
Self for method chaining
|
||||||
|
"""
|
||||||
|
room_builder = RoomReservationBuilder(
|
||||||
|
arrival=arrival,
|
||||||
|
departure=departure,
|
||||||
|
room_type=room_type,
|
||||||
|
room_number=room_number,
|
||||||
|
status=status,
|
||||||
|
adults=adults,
|
||||||
|
children=children,
|
||||||
|
infants=infants,
|
||||||
|
rate_plan_code=rate_plan_code,
|
||||||
|
connected_room_type=connected_room_type,
|
||||||
|
revenue_logis_per_day=revenue_logis_per_day,
|
||||||
|
revenue_total_per_day=revenue_total_per_day,
|
||||||
|
)
|
||||||
|
self.rooms.append(room_builder)
|
||||||
|
return self
|
||||||
|
|
||||||
|
def add_room_builder(
|
||||||
|
self, room_builder: RoomReservationBuilder
|
||||||
|
) -> "ReservationXMLBuilder":
|
||||||
|
"""Add a pre-configured room builder.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
room_builder: RoomReservationBuilder instance
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
Self for method chaining
|
||||||
|
"""
|
||||||
|
self.rooms.append(room_builder)
|
||||||
|
return self
|
||||||
|
|
||||||
|
def build(self) -> ET.Element:
|
||||||
|
"""Build the reservation XML element.
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
XML Element for the reservation
|
||||||
|
"""
|
||||||
|
reservation_attrs = {
|
||||||
|
"hotelID": self.hotel_id,
|
||||||
|
"id": self.reservation_id,
|
||||||
|
"number": self.reservation_number,
|
||||||
|
"date": self.reservation_date,
|
||||||
|
"creationTime": self.creation_time,
|
||||||
|
"type": self.reservation_type,
|
||||||
|
}
|
||||||
|
|
||||||
|
if self.advertising_medium:
|
||||||
|
reservation_attrs["advertisingMedium"] = self.advertising_medium
|
||||||
|
if self.advertising_partner:
|
||||||
|
reservation_attrs["advertisingPartner"] = self.advertising_partner
|
||||||
|
if self.advertising_campagne:
|
||||||
|
reservation_attrs["advertisingCampagne"] = self.advertising_campagne
|
||||||
|
|
||||||
|
reservation_elem = ET.Element("reservation", reservation_attrs)
|
||||||
|
|
||||||
|
# Add guest element
|
||||||
|
if self.guest_data:
|
||||||
|
ET.SubElement(reservation_elem, "guest", self.guest_data)
|
||||||
|
|
||||||
|
# Add roomReservations
|
||||||
|
if self.rooms:
|
||||||
|
room_reservations_elem = ET.SubElement(
|
||||||
|
reservation_elem, "roomReservations"
|
||||||
|
)
|
||||||
|
for room_builder in self.rooms:
|
||||||
|
room_elem = room_builder.build()
|
||||||
|
room_reservations_elem.append(room_elem)
|
||||||
|
|
||||||
|
return reservation_elem
|
||||||
|
|
||||||
|
def build_xml(self, include_xml_declaration: bool = True) -> str:
|
||||||
|
"""Build the complete XML string for this reservation.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
include_xml_declaration: Whether to include <?xml version="1.0"?> declaration
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
XML string
|
||||||
|
"""
|
||||||
|
reservation_elem = self.build()
|
||||||
|
|
||||||
|
# Wrap in <reservations> root element
|
||||||
|
root = ET.Element("reservations")
|
||||||
|
root.append(reservation_elem)
|
||||||
|
|
||||||
|
xml_str = ET.tostring(root, encoding="unicode")
|
||||||
|
|
||||||
|
if include_xml_declaration:
|
||||||
|
xml_str = '<?xml version="1.0" ?>\n' + xml_str
|
||||||
|
|
||||||
|
return xml_str
|
||||||
|
|
||||||
|
|
||||||
|
class MultiReservationXMLBuilder:
|
||||||
|
"""Builder for creating XML documents with multiple reservations.
|
||||||
|
|
||||||
|
Example:
|
||||||
|
builder = MultiReservationXMLBuilder()
|
||||||
|
builder.add_reservation(
|
||||||
|
ReservationXMLBuilder(...).set_guest(...).add_room(...)
|
||||||
|
)
|
||||||
|
builder.add_reservation(
|
||||||
|
ReservationXMLBuilder(...).set_guest(...).add_room(...)
|
||||||
|
)
|
||||||
|
xml_string = builder.build_xml()
|
||||||
|
"""
|
||||||
|
|
||||||
|
def __init__(self):
|
||||||
|
"""Initialize multi-reservation builder."""
|
||||||
|
self.reservations: list[ReservationXMLBuilder] = []
|
||||||
|
|
||||||
|
def add_reservation(
|
||||||
|
self, reservation_builder: ReservationXMLBuilder
|
||||||
|
) -> "MultiReservationXMLBuilder":
|
||||||
|
"""Add a reservation to the document.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
reservation_builder: ReservationXMLBuilder instance
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
Self for method chaining
|
||||||
|
"""
|
||||||
|
self.reservations.append(reservation_builder)
|
||||||
|
return self
|
||||||
|
|
||||||
|
def build_xml(self, include_xml_declaration: bool = True) -> str:
|
||||||
|
"""Build the complete XML string with all reservations.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
include_xml_declaration: Whether to include <?xml version="1.0"?> declaration
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
XML string with multiple reservations
|
||||||
|
"""
|
||||||
|
root = ET.Element("reservations")
|
||||||
|
|
||||||
|
for reservation_builder in self.reservations:
|
||||||
|
reservation_elem = reservation_builder.build()
|
||||||
|
root.append(reservation_elem)
|
||||||
|
|
||||||
|
xml_str = ET.tostring(root, encoding="unicode")
|
||||||
|
|
||||||
|
if include_xml_declaration:
|
||||||
|
xml_str = '<?xml version="1.0" ?>\n' + xml_str
|
||||||
|
|
||||||
|
return xml_str
|
||||||
@@ -96,6 +96,12 @@ def test_config():
|
|||||||
"hotel_name": "Test Hotel",
|
"hotel_name": "Test Hotel",
|
||||||
"username": "testuser",
|
"username": "testuser",
|
||||||
"password": "testpass",
|
"password": "testpass",
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"hotel_id": "135",
|
||||||
|
"hotel_name": "Another Hotel",
|
||||||
|
"username": "anotheruser",
|
||||||
|
"password": "anotherpass",
|
||||||
}
|
}
|
||||||
],
|
],
|
||||||
"default_hotel_code": "HOTEL123",
|
"default_hotel_code": "HOTEL123",
|
||||||
@@ -295,7 +301,7 @@ class TestWixWebhookEndpoint:
|
|||||||
|
|
||||||
def test_wix_webhook_test_endpoint(self, client, sample_wix_form_data):
|
def test_wix_webhook_test_endpoint(self, client, sample_wix_form_data):
|
||||||
"""Test the test endpoint works identically."""
|
"""Test the test endpoint works identically."""
|
||||||
response = client.post("/api/webhook/wix-form/test", json=sample_wix_form_data)
|
response = client.post("/api/webhook/wix-form", json=sample_wix_form_data)
|
||||||
assert response.status_code == HttpStatusCode.OK
|
assert response.status_code == HttpStatusCode.OK
|
||||||
data = response.json()
|
data = response.json()
|
||||||
assert data["status"] == "success"
|
assert data["status"] == "success"
|
||||||
@@ -318,7 +324,7 @@ class TestWixWebhookEndpoint:
|
|||||||
with test_form_file.open() as f:
|
with test_form_file.open() as f:
|
||||||
form_data = json.load(f)
|
form_data = json.load(f)
|
||||||
|
|
||||||
response = client.post("/api/webhook/wix-form/test", json=form_data)
|
response = client.post("/api/webhook/wix-form", json=form_data)
|
||||||
assert response.status_code == HttpStatusCode.OK
|
assert response.status_code == HttpStatusCode.OK
|
||||||
data = response.json()
|
data = response.json()
|
||||||
assert data["status"] == "success"
|
assert data["status"] == "success"
|
||||||
@@ -417,7 +423,7 @@ class TestGenericWebhookEndpoint:
|
|||||||
"""Test successful generic webhook submission with real form data."""
|
"""Test successful generic webhook submission with real form data."""
|
||||||
unique_id = uuid.uuid4().hex[:8]
|
unique_id = uuid.uuid4().hex[:8]
|
||||||
test_data = {
|
test_data = {
|
||||||
"hotel_data": {"hotelname": "Bemelmans", "hotelcode": "39054_001"},
|
"hotel_data": {"hotelname": "Bemelmans", "hotelcode": "HOTEL123"},
|
||||||
"form_data": {
|
"form_data": {
|
||||||
"sprache": "it",
|
"sprache": "it",
|
||||||
"anreise": "14.10.2025",
|
"anreise": "14.10.2025",
|
||||||
@@ -451,14 +457,14 @@ class TestGenericWebhookEndpoint:
|
|||||||
assert "timestamp" in data
|
assert "timestamp" in data
|
||||||
assert (
|
assert (
|
||||||
data["message"]
|
data["message"]
|
||||||
== "Generic webhook data received and processed successfully"
|
== "Generic webhook data processed successfully"
|
||||||
)
|
)
|
||||||
|
|
||||||
def test_generic_webhook_creates_customer_and_reservation(self, client):
|
def test_generic_webhook_creates_customer_and_reservation(self, client):
|
||||||
"""Test that webhook creates customer and reservation in database."""
|
"""Test that webhook creates customer and reservation in database."""
|
||||||
unique_id = uuid.uuid4().hex[:8]
|
unique_id = uuid.uuid4().hex[:8]
|
||||||
test_data = {
|
test_data = {
|
||||||
"hotel_data": {"hotelname": "Test Hotel", "hotelcode": "TEST123"},
|
"hotel_data": {"hotelname": "Test Hotel", "hotelcode": "HOTEL123"},
|
||||||
"form_data": {
|
"form_data": {
|
||||||
"sprache": "de",
|
"sprache": "de",
|
||||||
"anreise": "25.12.2025",
|
"anreise": "25.12.2025",
|
||||||
@@ -517,7 +523,7 @@ class TestGenericWebhookEndpoint:
|
|||||||
(r for r in reservations if r.customer_id == customer.id), None
|
(r for r in reservations if r.customer_id == customer.id), None
|
||||||
)
|
)
|
||||||
assert reservation is not None, "Reservation should be created"
|
assert reservation is not None, "Reservation should be created"
|
||||||
assert reservation.hotel_code == "TEST123"
|
assert reservation.hotel_code == "HOTEL123"
|
||||||
assert reservation.hotel_name == "Test Hotel"
|
assert reservation.hotel_name == "Test Hotel"
|
||||||
assert reservation.num_adults == 2
|
assert reservation.num_adults == 2
|
||||||
assert reservation.num_children == 1
|
assert reservation.num_children == 1
|
||||||
@@ -537,7 +543,7 @@ class TestGenericWebhookEndpoint:
|
|||||||
def test_generic_webhook_missing_dates(self, client):
|
def test_generic_webhook_missing_dates(self, client):
|
||||||
"""Test webhook with missing required dates."""
|
"""Test webhook with missing required dates."""
|
||||||
test_data = {
|
test_data = {
|
||||||
"hotel_data": {"hotelname": "Test", "hotelcode": "123"},
|
"hotel_data": {"hotelname": "Test", "hotelcode": "HOTEL123"},
|
||||||
"form_data": {
|
"form_data": {
|
||||||
"sprache": "de",
|
"sprache": "de",
|
||||||
"name": "John",
|
"name": "John",
|
||||||
@@ -555,7 +561,7 @@ class TestGenericWebhookEndpoint:
|
|||||||
def test_generic_webhook_invalid_date_format(self, client):
|
def test_generic_webhook_invalid_date_format(self, client):
|
||||||
"""Test webhook with invalid date format."""
|
"""Test webhook with invalid date format."""
|
||||||
test_data = {
|
test_data = {
|
||||||
"hotel_data": {"hotelname": "Test", "hotelcode": "123"},
|
"hotel_data": {"hotelname": "Test", "hotelcode": "HOTEL123"},
|
||||||
"form_data": {
|
"form_data": {
|
||||||
"sprache": "en",
|
"sprache": "en",
|
||||||
"anreise": "2025-10-14", # Wrong format, should be DD.MM.YYYY
|
"anreise": "2025-10-14", # Wrong format, should be DD.MM.YYYY
|
||||||
@@ -577,7 +583,7 @@ class TestGenericWebhookEndpoint:
|
|||||||
"""Test webhook properly handles children ages."""
|
"""Test webhook properly handles children ages."""
|
||||||
unique_id = uuid.uuid4().hex[:8]
|
unique_id = uuid.uuid4().hex[:8]
|
||||||
test_data = {
|
test_data = {
|
||||||
"hotel_data": {"hotelname": "Family Hotel", "hotelcode": "FAM001"},
|
"hotel_data": {"hotelname": "Family Hotel", "hotelcode": "HOTEL123"},
|
||||||
"form_data": {
|
"form_data": {
|
||||||
"sprache": "it",
|
"sprache": "it",
|
||||||
"anreise": "01.08.2025",
|
"anreise": "01.08.2025",
|
||||||
@@ -608,9 +614,9 @@ class TestGenericWebhookEndpoint:
|
|||||||
result = await session.execute(select(Reservation))
|
result = await session.execute(select(Reservation))
|
||||||
reservations = result.scalars().all()
|
reservations = result.scalars().all()
|
||||||
reservation = next(
|
reservation = next(
|
||||||
(r for r in reservations if r.hotel_code == "FAM001"), None
|
(r for r in reservations if r.hotel_code == "HOTEL123"), None
|
||||||
)
|
)
|
||||||
assert reservation is not None
|
assert reservation is not None, "Reservation should be created"
|
||||||
assert reservation.num_children == 3
|
assert reservation.num_children == 3
|
||||||
# children_ages is stored as CSV string
|
# children_ages is stored as CSV string
|
||||||
children_ages = [
|
children_ages = [
|
||||||
@@ -631,9 +637,9 @@ class TestGenericWebhookEndpoint:
|
|||||||
),
|
),
|
||||||
None,
|
None,
|
||||||
)
|
)
|
||||||
assert customer is not None
|
assert customer is not None, "Customer should be created"
|
||||||
assert customer.phone is None # Empty phone should be None
|
assert customer.phone is None, "Empty phone should be None"
|
||||||
assert customer.name_prefix is None # -- should be filtered out
|
assert customer.name_prefix is None, "Name prefix '--' should be filtered out"
|
||||||
|
|
||||||
import asyncio
|
import asyncio
|
||||||
|
|
||||||
@@ -914,7 +920,7 @@ class TestErrorHandling:
|
|||||||
headers={"Content-Type": "application/json"},
|
headers={"Content-Type": "application/json"},
|
||||||
)
|
)
|
||||||
|
|
||||||
assert response.status_code == 422
|
assert response.status_code == 400
|
||||||
|
|
||||||
def test_wix_webhook_missing_required_fields(self, client):
|
def test_wix_webhook_missing_required_fields(self, client):
|
||||||
"""Test webhook with missing required fields."""
|
"""Test webhook with missing required fields."""
|
||||||
|
|||||||
215
tests/test_api_freerooms.py
Normal file
215
tests/test_api_freerooms.py
Normal file
@@ -0,0 +1,215 @@
|
|||||||
|
"""Integration tests for the FreeRooms endpoint."""
|
||||||
|
|
||||||
|
from __future__ import annotations
|
||||||
|
|
||||||
|
import asyncio
|
||||||
|
import gzip
|
||||||
|
import urllib.parse
|
||||||
|
from datetime import UTC, datetime
|
||||||
|
from unittest.mock import patch
|
||||||
|
|
||||||
|
import pytest
|
||||||
|
from fastapi.testclient import TestClient
|
||||||
|
from sqlalchemy import select
|
||||||
|
from sqlalchemy.ext.asyncio import async_sessionmaker, create_async_engine
|
||||||
|
|
||||||
|
from alpine_bits_python.alpinebits_server import AlpineBitsServer
|
||||||
|
from alpine_bits_python.api import app
|
||||||
|
from alpine_bits_python.const import HttpStatusCode
|
||||||
|
from alpine_bits_python.db import Base, Hotel, RoomAvailability
|
||||||
|
|
||||||
|
|
||||||
|
def build_request_xml(body: str, include_unique_id: bool = True) -> str:
|
||||||
|
unique = (
|
||||||
|
'<UniqueID Type="16" ID="1" Instance="CompleteSet"/>'
|
||||||
|
if include_unique_id
|
||||||
|
else ""
|
||||||
|
)
|
||||||
|
return f"""<?xml version="1.0" encoding="UTF-8"?>
|
||||||
|
<OTA_HotelInvCountNotifRQ xmlns="http://www.opentravel.org/OTA/2003/05" Version="7.000">
|
||||||
|
{unique}
|
||||||
|
<Inventories HotelCode="HOTEL123" HotelName="Integration Hotel">
|
||||||
|
{body}
|
||||||
|
</Inventories>
|
||||||
|
</OTA_HotelInvCountNotifRQ>"""
|
||||||
|
|
||||||
|
|
||||||
|
INVENTORY_A = """
|
||||||
|
<Inventory>
|
||||||
|
<StatusApplicationControl Start="2025-10-01" End="2025-10-03" InvTypeCode="DBL"/>
|
||||||
|
<InvCounts>
|
||||||
|
<InvCount CountType="2" Count="3"/>
|
||||||
|
</InvCounts>
|
||||||
|
</Inventory>
|
||||||
|
"""
|
||||||
|
|
||||||
|
INVENTORY_B = """
|
||||||
|
<Inventory>
|
||||||
|
<StatusApplicationControl Start="2025-10-02" End="2025-10-02" InvTypeCode="DBL"/>
|
||||||
|
<InvCounts>
|
||||||
|
<InvCount CountType="2" Count="1"/>
|
||||||
|
</InvCounts>
|
||||||
|
</Inventory>
|
||||||
|
"""
|
||||||
|
|
||||||
|
|
||||||
|
@pytest.fixture
|
||||||
|
def freerooms_test_config():
|
||||||
|
return {
|
||||||
|
"server": {
|
||||||
|
"codecontext": "ADVERTISING",
|
||||||
|
"code": "70597314",
|
||||||
|
"companyname": "99tales Gmbh",
|
||||||
|
"res_id_source_context": "99tales",
|
||||||
|
},
|
||||||
|
"alpine_bits_auth": [
|
||||||
|
{
|
||||||
|
"hotel_id": "HOTEL123",
|
||||||
|
"hotel_name": "Integration Hotel",
|
||||||
|
"username": "testuser",
|
||||||
|
"password": "testpass",
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"database": {"url": "sqlite+aiosqlite:///:memory:"},
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
@pytest.fixture
|
||||||
|
def freerooms_client(freerooms_test_config):
|
||||||
|
engine = create_async_engine("sqlite+aiosqlite:///:memory:", echo=False)
|
||||||
|
|
||||||
|
async def create_tables():
|
||||||
|
async with engine.begin() as conn:
|
||||||
|
await conn.run_sync(Base.metadata.create_all)
|
||||||
|
|
||||||
|
asyncio.run(create_tables())
|
||||||
|
|
||||||
|
with patch("alpine_bits_python.api.load_config", return_value=freerooms_test_config), patch(
|
||||||
|
"alpine_bits_python.api.create_database_engine", return_value=engine
|
||||||
|
):
|
||||||
|
app.state.engine = engine
|
||||||
|
app.state.async_sessionmaker = async_sessionmaker(engine, expire_on_commit=False)
|
||||||
|
app.state.config = freerooms_test_config
|
||||||
|
app.state.alpine_bits_server = AlpineBitsServer(freerooms_test_config)
|
||||||
|
|
||||||
|
with TestClient(app) as test_client:
|
||||||
|
yield test_client
|
||||||
|
|
||||||
|
|
||||||
|
@pytest.fixture
|
||||||
|
def freerooms_headers():
|
||||||
|
return {
|
||||||
|
"Authorization": "Basic dGVzdHVzZXI6dGVzdHBhc3M=",
|
||||||
|
"X-AlpineBits-ClientProtocolVersion": "2024-10",
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
def seed_hotel_if_missing(client: TestClient):
|
||||||
|
async def _seed():
|
||||||
|
async_sessionmaker = client.app.state.async_sessionmaker
|
||||||
|
async with async_sessionmaker() as session:
|
||||||
|
result = await session.execute(
|
||||||
|
select(Hotel).where(Hotel.hotel_id == "HOTEL123")
|
||||||
|
)
|
||||||
|
if result.scalar_one_or_none():
|
||||||
|
return
|
||||||
|
session.add(
|
||||||
|
Hotel(
|
||||||
|
hotel_id="HOTEL123",
|
||||||
|
hotel_name="Integration Hotel",
|
||||||
|
username="testuser",
|
||||||
|
password_hash="integration-hash",
|
||||||
|
created_at=datetime.now(UTC),
|
||||||
|
updated_at=datetime.now(UTC),
|
||||||
|
is_active=True,
|
||||||
|
)
|
||||||
|
)
|
||||||
|
await session.commit()
|
||||||
|
|
||||||
|
asyncio.run(_seed())
|
||||||
|
|
||||||
|
|
||||||
|
def fetch_availability(client: TestClient):
|
||||||
|
async def _fetch():
|
||||||
|
async_sessionmaker = client.app.state.async_sessionmaker
|
||||||
|
async with async_sessionmaker() as session:
|
||||||
|
result = await session.execute(
|
||||||
|
select(RoomAvailability).order_by(RoomAvailability.date)
|
||||||
|
)
|
||||||
|
return result.scalars().all()
|
||||||
|
|
||||||
|
return asyncio.run(_fetch())
|
||||||
|
|
||||||
|
|
||||||
|
def test_freerooms_endpoint_complete_set(freerooms_client: TestClient, freerooms_headers):
|
||||||
|
seed_hotel_if_missing(freerooms_client)
|
||||||
|
xml = build_request_xml(INVENTORY_A, include_unique_id=True)
|
||||||
|
|
||||||
|
response = freerooms_client.post(
|
||||||
|
"/api/alpinebits/server-2024-10",
|
||||||
|
data={"action": "OTA_HotelInvCountNotif:FreeRooms", "request": xml},
|
||||||
|
headers=freerooms_headers,
|
||||||
|
)
|
||||||
|
|
||||||
|
assert response.status_code == HttpStatusCode.OK
|
||||||
|
assert "<Success" in response.text
|
||||||
|
|
||||||
|
rows = fetch_availability(freerooms_client)
|
||||||
|
assert len(rows) == 3
|
||||||
|
assert rows[0].count_type_2 == 3
|
||||||
|
|
||||||
|
|
||||||
|
def test_freerooms_endpoint_delta_updates_existing_rows(
|
||||||
|
freerooms_client: TestClient, freerooms_headers
|
||||||
|
):
|
||||||
|
seed_hotel_if_missing(freerooms_client)
|
||||||
|
complete_xml = build_request_xml(INVENTORY_A, include_unique_id=True)
|
||||||
|
delta_xml = build_request_xml(INVENTORY_B, include_unique_id=False)
|
||||||
|
|
||||||
|
response = freerooms_client.post(
|
||||||
|
"/api/alpinebits/server-2024-10",
|
||||||
|
data={"action": "OTA_HotelInvCountNotif:FreeRooms", "request": complete_xml},
|
||||||
|
headers=freerooms_headers,
|
||||||
|
)
|
||||||
|
assert response.status_code == HttpStatusCode.OK
|
||||||
|
|
||||||
|
response = freerooms_client.post(
|
||||||
|
"/api/alpinebits/server-2024-10",
|
||||||
|
data={"action": "OTA_HotelInvCountNotif:FreeRooms", "request": delta_xml},
|
||||||
|
headers=freerooms_headers,
|
||||||
|
)
|
||||||
|
assert response.status_code == HttpStatusCode.OK
|
||||||
|
|
||||||
|
rows = fetch_availability(freerooms_client)
|
||||||
|
counts = {row.date.isoformat(): row.count_type_2 for row in rows}
|
||||||
|
assert counts["2025-10-02"] == 1
|
||||||
|
assert counts["2025-10-01"] == 3
|
||||||
|
|
||||||
|
|
||||||
|
def test_freerooms_endpoint_accepts_gzip_payload(
|
||||||
|
freerooms_client: TestClient, freerooms_headers
|
||||||
|
):
|
||||||
|
seed_hotel_if_missing(freerooms_client)
|
||||||
|
xml = build_request_xml(INVENTORY_A, include_unique_id=True)
|
||||||
|
encoded = urllib.parse.urlencode(
|
||||||
|
{"action": "OTA_HotelInvCountNotif:FreeRooms", "request": xml}
|
||||||
|
).encode("utf-8")
|
||||||
|
compressed = gzip.compress(encoded)
|
||||||
|
|
||||||
|
headers = {
|
||||||
|
**freerooms_headers,
|
||||||
|
"Content-Encoding": "gzip",
|
||||||
|
"Content-Type": "application/x-www-form-urlencoded",
|
||||||
|
}
|
||||||
|
|
||||||
|
response = freerooms_client.post(
|
||||||
|
"/api/alpinebits/server-2024-10",
|
||||||
|
data=compressed,
|
||||||
|
headers=headers,
|
||||||
|
)
|
||||||
|
|
||||||
|
assert response.status_code == HttpStatusCode.OK
|
||||||
|
assert "<Success" in response.text
|
||||||
|
|
||||||
|
rows = fetch_availability(freerooms_client)
|
||||||
|
assert len(rows) == 3
|
||||||
@@ -340,13 +340,348 @@ class TestConversionServiceWithImportedData:
|
|||||||
assert stats["total_daily_sales"] == 0
|
assert stats["total_daily_sales"] == 0
|
||||||
assert stats["errors"] == 0
|
assert stats["errors"] == 0
|
||||||
|
|
||||||
|
@pytest.mark.asyncio
|
||||||
|
async def test_duplicate_reservations(self, test_db_session):
|
||||||
|
"""Test that room entries are correctly updated when reservation status changes.
|
||||||
|
|
||||||
|
This test detects a bug where ConversionRoom records are not properly upserted
|
||||||
|
when the same reservation is processed multiple times with different room numbers.
|
||||||
|
|
||||||
|
Scenario:
|
||||||
|
1. Process reservation with status='request', no revenue, room_number='101'
|
||||||
|
2. Process reservation with status='reservation', with revenue, room_number='102'
|
||||||
|
3. Swap: Process same reservations but reversed - first one now has status='reservation'
|
||||||
|
with room_number='201', second has status='request' with room_number='202'
|
||||||
|
4. The old room entries (101, 102) should no longer exist in the database
|
||||||
|
"""
|
||||||
|
from tests.helpers import ReservationXMLBuilder, MultiReservationXMLBuilder
|
||||||
|
|
||||||
|
# First batch: Process two reservations
|
||||||
|
multi_builder1 = MultiReservationXMLBuilder()
|
||||||
|
|
||||||
|
# Reservation 1: Request status, no revenue, room 101
|
||||||
|
res1_v1 = (
|
||||||
|
ReservationXMLBuilder(
|
||||||
|
hotel_id="39054_001",
|
||||||
|
reservation_id="res_001",
|
||||||
|
reservation_number="RES-001",
|
||||||
|
reservation_date="2025-11-14",
|
||||||
|
reservation_type="request",
|
||||||
|
)
|
||||||
|
.set_guest(
|
||||||
|
guest_id="guest_001",
|
||||||
|
first_name="Alice",
|
||||||
|
last_name="Johnson",
|
||||||
|
email="alice@example.com",
|
||||||
|
)
|
||||||
|
.add_room(
|
||||||
|
arrival="2025-12-01",
|
||||||
|
departure="2025-12-03",
|
||||||
|
room_number="101",
|
||||||
|
status="request",
|
||||||
|
# No revenue
|
||||||
|
)
|
||||||
|
)
|
||||||
|
multi_builder1.add_reservation(res1_v1)
|
||||||
|
|
||||||
|
# Reservation 2: Reservation status, with revenue, room 102
|
||||||
|
res2_v1 = (
|
||||||
|
ReservationXMLBuilder(
|
||||||
|
hotel_id="39054_001",
|
||||||
|
reservation_id="res_002",
|
||||||
|
reservation_number="RES-002",
|
||||||
|
reservation_date="2025-11-15",
|
||||||
|
reservation_type="reservation",
|
||||||
|
)
|
||||||
|
.set_guest(
|
||||||
|
guest_id="guest_002",
|
||||||
|
first_name="Bob",
|
||||||
|
last_name="Smith",
|
||||||
|
email="bob@example.com",
|
||||||
|
)
|
||||||
|
.add_room(
|
||||||
|
arrival="2025-12-10",
|
||||||
|
departure="2025-12-12",
|
||||||
|
room_number="102",
|
||||||
|
status="reserved",
|
||||||
|
revenue_logis_per_day=150.0,
|
||||||
|
)
|
||||||
|
)
|
||||||
|
multi_builder1.add_reservation(res2_v1)
|
||||||
|
|
||||||
|
xml_content1 = multi_builder1.build_xml()
|
||||||
|
|
||||||
|
# Process first batch
|
||||||
|
service = ConversionService(test_db_session)
|
||||||
|
stats1 = await service.process_conversion_xml(xml_content1)
|
||||||
|
|
||||||
|
assert stats1["total_reservations"] == 2
|
||||||
|
|
||||||
|
# Verify rooms exist in database
|
||||||
|
result = await test_db_session.execute(
|
||||||
|
select(ConversionRoom).where(ConversionRoom.room_number == "101")
|
||||||
|
)
|
||||||
|
room_101 = result.scalar_one_or_none()
|
||||||
|
assert room_101 is not None, "Room 101 should exist after first processing"
|
||||||
|
|
||||||
|
result = await test_db_session.execute(
|
||||||
|
select(ConversionRoom).where(ConversionRoom.room_number == "102")
|
||||||
|
)
|
||||||
|
room_102 = result.scalar_one_or_none()
|
||||||
|
assert room_102 is not None, "Room 102 should exist after first processing"
|
||||||
|
|
||||||
|
# Second batch: Swap the reservations and change room numbers
|
||||||
|
multi_builder2 = MultiReservationXMLBuilder()
|
||||||
|
|
||||||
|
# Reservation 1: NOW has reservation status, with revenue, room 201 (changed from 101)
|
||||||
|
res1_v2 = (
|
||||||
|
ReservationXMLBuilder(
|
||||||
|
hotel_id="39054_001",
|
||||||
|
reservation_id="res_001", # Same ID
|
||||||
|
reservation_number="RES-001", # Same number
|
||||||
|
reservation_date="2025-11-14",
|
||||||
|
reservation_type="reservation", # Changed from request
|
||||||
|
)
|
||||||
|
.set_guest(
|
||||||
|
guest_id="guest_001",
|
||||||
|
first_name="Alice",
|
||||||
|
last_name="Johnson",
|
||||||
|
email="alice@example.com",
|
||||||
|
)
|
||||||
|
.add_room(
|
||||||
|
arrival="2025-12-01",
|
||||||
|
departure="2025-12-03",
|
||||||
|
room_number="201", # Changed from 101
|
||||||
|
status="reserved",
|
||||||
|
revenue_logis_per_day=200.0, # Now has revenue
|
||||||
|
)
|
||||||
|
)
|
||||||
|
multi_builder2.add_reservation(res1_v2)
|
||||||
|
|
||||||
|
# Reservation 2: NOW has request status, no revenue, room 202 (changed from 102)
|
||||||
|
res2_v2 = (
|
||||||
|
ReservationXMLBuilder(
|
||||||
|
hotel_id="39054_001",
|
||||||
|
reservation_id="res_002", # Same ID
|
||||||
|
reservation_number="RES-002", # Same number
|
||||||
|
reservation_date="2025-11-15",
|
||||||
|
reservation_type="request", # Changed from reservation
|
||||||
|
)
|
||||||
|
.set_guest(
|
||||||
|
guest_id="guest_002",
|
||||||
|
first_name="Bob",
|
||||||
|
last_name="Smith",
|
||||||
|
email="bob@example.com",
|
||||||
|
)
|
||||||
|
.add_room(
|
||||||
|
arrival="2025-12-10",
|
||||||
|
departure="2025-12-12",
|
||||||
|
room_number="202", # Changed from 102
|
||||||
|
status="request",
|
||||||
|
# No revenue anymore
|
||||||
|
)
|
||||||
|
)
|
||||||
|
multi_builder2.add_reservation(res2_v2)
|
||||||
|
|
||||||
|
xml_content2 = multi_builder2.build_xml()
|
||||||
|
|
||||||
|
# Process second batch
|
||||||
|
stats2 = await service.process_conversion_xml(xml_content2)
|
||||||
|
|
||||||
|
assert stats2["total_reservations"] == 2
|
||||||
|
|
||||||
|
# BUG DETECTION: Old room entries (101, 102) should NOT exist anymore
|
||||||
|
# They should have been replaced by new room entries (201, 202)
|
||||||
|
|
||||||
|
result = await test_db_session.execute(
|
||||||
|
select(ConversionRoom).where(ConversionRoom.room_number == "101")
|
||||||
|
)
|
||||||
|
room_101_after = result.scalar_one_or_none()
|
||||||
|
assert room_101_after is None, (
|
||||||
|
"BUG: Room 101 should no longer exist after reprocessing with room 201. "
|
||||||
|
"Old room entries are not being removed when reservation is updated."
|
||||||
|
)
|
||||||
|
|
||||||
|
result = await test_db_session.execute(
|
||||||
|
select(ConversionRoom).where(ConversionRoom.room_number == "102")
|
||||||
|
)
|
||||||
|
room_102_after = result.scalar_one_or_none()
|
||||||
|
assert room_102_after is None, (
|
||||||
|
"BUG: Room 102 should no longer exist after reprocessing with room 202. "
|
||||||
|
"Old room entries are not being removed when reservation is updated."
|
||||||
|
)
|
||||||
|
|
||||||
|
# New room entries should exist
|
||||||
|
result = await test_db_session.execute(
|
||||||
|
select(ConversionRoom).where(ConversionRoom.room_number == "201")
|
||||||
|
)
|
||||||
|
room_201 = result.scalar_one_or_none()
|
||||||
|
assert room_201 is not None, "Room 201 should exist after second processing"
|
||||||
|
|
||||||
|
result = await test_db_session.execute(
|
||||||
|
select(ConversionRoom).where(ConversionRoom.room_number == "202")
|
||||||
|
)
|
||||||
|
room_202 = result.scalar_one_or_none()
|
||||||
|
assert room_202 is not None, "Room 202 should exist after second processing"
|
||||||
|
|
||||||
|
# Verify we only have 2 conversion room records total (not 4)
|
||||||
|
result = await test_db_session.execute(select(ConversionRoom))
|
||||||
|
all_rooms = result.scalars().all()
|
||||||
|
assert len(all_rooms) == 2, (
|
||||||
|
f"BUG: Expected 2 conversion rooms total, but found {len(all_rooms)}. "
|
||||||
|
f"Old room entries are not being deleted. Room numbers: {[r.room_number for r in all_rooms]}"
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
|
class TestXMLBuilderUsage:
|
||||||
|
"""Demonstrate usage of XML builder helpers for creating test data."""
|
||||||
|
|
||||||
|
@pytest.mark.asyncio
|
||||||
|
async def test_using_xml_builder_for_simple_reservation(self, test_db_session):
|
||||||
|
"""Example: Create a simple reservation using the XML builder helper."""
|
||||||
|
from tests.helpers import ReservationXMLBuilder
|
||||||
|
|
||||||
|
# Build a reservation with convenient fluent API
|
||||||
|
xml_content = (
|
||||||
|
ReservationXMLBuilder(
|
||||||
|
hotel_id="39054_001",
|
||||||
|
reservation_id="test_123",
|
||||||
|
reservation_number="RES-123",
|
||||||
|
reservation_date="2025-11-14",
|
||||||
|
)
|
||||||
|
.set_guest(
|
||||||
|
guest_id="guest_001",
|
||||||
|
first_name="John",
|
||||||
|
last_name="Doe",
|
||||||
|
email="john@example.com",
|
||||||
|
country_code="US",
|
||||||
|
)
|
||||||
|
.add_room(
|
||||||
|
arrival="2025-12-01",
|
||||||
|
departure="2025-12-05",
|
||||||
|
room_type="DZV",
|
||||||
|
room_number="101",
|
||||||
|
revenue_logis_per_day=150.0,
|
||||||
|
adults=2
|
||||||
|
)
|
||||||
|
.build_xml()
|
||||||
|
)
|
||||||
|
|
||||||
|
# Process the XML
|
||||||
|
service = ConversionService(test_db_session)
|
||||||
|
stats = await service.process_conversion_xml(xml_content)
|
||||||
|
|
||||||
|
assert stats["total_reservations"] == 1
|
||||||
|
assert stats["total_daily_sales"] == 5 # 4 nights + departure day
|
||||||
|
|
||||||
|
@pytest.mark.asyncio
|
||||||
|
async def test_using_xml_builder_for_multi_room_reservation(
|
||||||
|
self, test_db_session
|
||||||
|
):
|
||||||
|
"""Example: Create a reservation with multiple rooms."""
|
||||||
|
from tests.helpers import ReservationXMLBuilder
|
||||||
|
|
||||||
|
xml_content = (
|
||||||
|
ReservationXMLBuilder(
|
||||||
|
hotel_id="39054_001",
|
||||||
|
reservation_id="test_456",
|
||||||
|
reservation_number="RES-456",
|
||||||
|
reservation_date="2025-11-14",
|
||||||
|
)
|
||||||
|
.set_guest(
|
||||||
|
guest_id="guest_002",
|
||||||
|
first_name="Jane",
|
||||||
|
last_name="Smith",
|
||||||
|
email="jane@example.com",
|
||||||
|
)
|
||||||
|
.add_room(
|
||||||
|
arrival="2025-12-01",
|
||||||
|
departure="2025-12-05",
|
||||||
|
room_number="101",
|
||||||
|
revenue_logis_per_day=150.0,
|
||||||
|
)
|
||||||
|
.add_room(
|
||||||
|
arrival="2025-12-01",
|
||||||
|
departure="2025-12-05",
|
||||||
|
room_number="102",
|
||||||
|
revenue_logis_per_day=200.0,
|
||||||
|
)
|
||||||
|
.build_xml()
|
||||||
|
)
|
||||||
|
|
||||||
|
service = ConversionService(test_db_session)
|
||||||
|
stats = await service.process_conversion_xml(xml_content)
|
||||||
|
|
||||||
|
assert stats["total_reservations"] == 1
|
||||||
|
# 2 rooms × 5 daily sales each = 10 total
|
||||||
|
assert stats["total_daily_sales"] == 10
|
||||||
|
|
||||||
|
@pytest.mark.asyncio
|
||||||
|
async def test_using_multi_reservation_builder(self, test_db_session):
|
||||||
|
"""Example: Create multiple reservations in one XML document."""
|
||||||
|
from tests.helpers import ReservationXMLBuilder, MultiReservationXMLBuilder
|
||||||
|
|
||||||
|
multi_builder = MultiReservationXMLBuilder()
|
||||||
|
|
||||||
|
# Add first reservation
|
||||||
|
res1 = (
|
||||||
|
ReservationXMLBuilder(
|
||||||
|
hotel_id="39054_001",
|
||||||
|
reservation_id="test_001",
|
||||||
|
reservation_number="RES-001",
|
||||||
|
reservation_date="2025-11-14",
|
||||||
|
)
|
||||||
|
.set_guest(
|
||||||
|
guest_id="guest_001",
|
||||||
|
first_name="Alice",
|
||||||
|
last_name="Johnson",
|
||||||
|
email="alice@example.com",
|
||||||
|
)
|
||||||
|
.add_room(
|
||||||
|
arrival="2025-12-01",
|
||||||
|
departure="2025-12-03",
|
||||||
|
revenue_logis_per_day=100.0,
|
||||||
|
)
|
||||||
|
)
|
||||||
|
multi_builder.add_reservation(res1)
|
||||||
|
|
||||||
|
# Add second reservation
|
||||||
|
res2 = (
|
||||||
|
ReservationXMLBuilder(
|
||||||
|
hotel_id="39054_001",
|
||||||
|
reservation_id="test_002",
|
||||||
|
reservation_number="RES-002",
|
||||||
|
reservation_date="2025-11-15",
|
||||||
|
)
|
||||||
|
.set_guest(
|
||||||
|
guest_id="guest_002",
|
||||||
|
first_name="Bob",
|
||||||
|
last_name="Williams",
|
||||||
|
email="bob@example.com",
|
||||||
|
)
|
||||||
|
.add_room(
|
||||||
|
arrival="2025-12-10",
|
||||||
|
departure="2025-12-12",
|
||||||
|
revenue_logis_per_day=150.0,
|
||||||
|
)
|
||||||
|
)
|
||||||
|
multi_builder.add_reservation(res2)
|
||||||
|
|
||||||
|
xml_content = multi_builder.build_xml()
|
||||||
|
|
||||||
|
# Process the XML
|
||||||
|
service = ConversionService(test_db_session)
|
||||||
|
stats = await service.process_conversion_xml(xml_content)
|
||||||
|
|
||||||
|
assert stats["total_reservations"] == 2
|
||||||
|
# Res1: 3 days (2 nights), Res2: 3 days (2 nights) = 6 total
|
||||||
|
assert stats["total_daily_sales"] == 6
|
||||||
|
|
||||||
|
|
||||||
class TestHashedMatchingLogic:
|
class TestHashedMatchingLogic:
|
||||||
"""Test the hashed matching logic used in ConversionService."""
|
"""Test the hashed matching logic used in ConversionService."""
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
|
|
||||||
@pytest.mark.asyncio
|
@pytest.mark.asyncio
|
||||||
async def test_conversion_guest_hashed_fields_are_populated(
|
async def test_conversion_guest_hashed_fields_are_populated(
|
||||||
self, test_db_session
|
self, test_db_session
|
||||||
|
|||||||
@@ -124,7 +124,7 @@
|
|||||||
},
|
},
|
||||||
{
|
{
|
||||||
"label": "hotelid",
|
"label": "hotelid",
|
||||||
"value": "39054_001"
|
"value": "135"
|
||||||
},
|
},
|
||||||
{
|
{
|
||||||
"label": "hotelname",
|
"label": "hotelname",
|
||||||
@@ -260,7 +260,7 @@
|
|||||||
"field:angebot_auswaehlen": "Zimmer: Doppelzimmer",
|
"field:angebot_auswaehlen": "Zimmer: Doppelzimmer",
|
||||||
"field:utm_content": "",
|
"field:utm_content": "",
|
||||||
"field:last_name_d97c": "Pohl",
|
"field:last_name_d97c": "Pohl",
|
||||||
"field:hotelid": "39054_001",
|
"field:hotelid": "135",
|
||||||
"submissionsLink": "https://manage.wix.app/forms/submissions/1dea821c-8168-4736-96e4-4b92e8b364cf/e084006b-ae83-4e4d-b2f5-074118cdb3b1?d=https%3A%2F%2Fmanage.wix.com%2Fdashboard%2F1dea821c-8168-4736-96e4-4b92e8b364cf%2Fwix-forms%2Fform%2Fe084006b-ae83-4e4d-b2f5-074118cdb3b1%2Fsubmissions&s=true",
|
"submissionsLink": "https://manage.wix.app/forms/submissions/1dea821c-8168-4736-96e4-4b92e8b364cf/e084006b-ae83-4e4d-b2f5-074118cdb3b1?d=https%3A%2F%2Fmanage.wix.com%2Fdashboard%2F1dea821c-8168-4736-96e4-4b92e8b364cf%2Fwix-forms%2Fform%2Fe084006b-ae83-4e4d-b2f5-074118cdb3b1%2Fsubmissions&s=true",
|
||||||
"field:gbraid": "0AAAAADxR52Ad0oCzeogeTrupgGeMwD7Yp",
|
"field:gbraid": "0AAAAADxR52Ad0oCzeogeTrupgGeMwD7Yp",
|
||||||
"field:fbclid": "",
|
"field:fbclid": "",
|
||||||
|
|||||||
367
tests/test_free_rooms_action.py
Normal file
367
tests/test_free_rooms_action.py
Normal file
@@ -0,0 +1,367 @@
|
|||||||
|
"""Unit tests for FreeRoomsAction."""
|
||||||
|
|
||||||
|
from __future__ import annotations
|
||||||
|
|
||||||
|
from datetime import UTC, datetime
|
||||||
|
|
||||||
|
import pytest
|
||||||
|
import pytest_asyncio
|
||||||
|
from sqlalchemy import select
|
||||||
|
from sqlalchemy.ext.asyncio import AsyncSession, async_sessionmaker, create_async_engine
|
||||||
|
|
||||||
|
from alpine_bits_python.alpinebits_server import AlpineBitsClientInfo, Version
|
||||||
|
from alpine_bits_python.const import HttpStatusCode
|
||||||
|
from alpine_bits_python.db import Base, Hotel, HotelInventory, RoomAvailability
|
||||||
|
from alpine_bits_python.free_rooms_action import FreeRoomsAction
|
||||||
|
|
||||||
|
|
||||||
|
TEST_CONFIG = {
|
||||||
|
"alpine_bits_auth": [
|
||||||
|
{
|
||||||
|
"hotel_id": "TESTHOTEL",
|
||||||
|
"hotel_name": "Unit Test Hotel",
|
||||||
|
"username": "testuser",
|
||||||
|
"password": "testpass",
|
||||||
|
}
|
||||||
|
]
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
def build_complete_set_xml(body: str, hotel_code: str = "TESTHOTEL") -> str:
|
||||||
|
return f"""<?xml version="1.0" encoding="UTF-8"?>
|
||||||
|
<OTA_HotelInvCountNotifRQ xmlns="http://www.opentravel.org/OTA/2003/05" Version="7.000">
|
||||||
|
<UniqueID Type="16" ID="1" Instance="CompleteSet"/>
|
||||||
|
<Inventories HotelCode="{hotel_code}" HotelName="Unit Hotel">
|
||||||
|
{body}
|
||||||
|
</Inventories>
|
||||||
|
</OTA_HotelInvCountNotifRQ>"""
|
||||||
|
|
||||||
|
|
||||||
|
def build_delta_xml(body: str, hotel_code: str = "TESTHOTEL") -> str:
|
||||||
|
return f"""<?xml version="1.0" encoding="UTF-8"?>
|
||||||
|
<OTA_HotelInvCountNotifRQ xmlns="http://www.opentravel.org/OTA/2003/05" Version="7.000">
|
||||||
|
<Inventories HotelCode="{hotel_code}" HotelName="Unit Hotel">
|
||||||
|
{body}
|
||||||
|
</Inventories>
|
||||||
|
</OTA_HotelInvCountNotifRQ>"""
|
||||||
|
|
||||||
|
|
||||||
|
def daily_inventory(start: str, end: str, inv_type: str = "DBL", count: int = 3) -> str:
|
||||||
|
return f"""
|
||||||
|
<Inventory>
|
||||||
|
<StatusApplicationControl Start="{start}" End="{end}" InvTypeCode="{inv_type}"/>
|
||||||
|
<InvCounts>
|
||||||
|
<InvCount CountType="2" Count="{count}"/>
|
||||||
|
</InvCounts>
|
||||||
|
</Inventory>
|
||||||
|
"""
|
||||||
|
|
||||||
|
|
||||||
|
@pytest_asyncio.fixture
|
||||||
|
async def db_engine():
|
||||||
|
engine = create_async_engine("sqlite+aiosqlite:///:memory:", echo=False)
|
||||||
|
async with engine.begin() as conn:
|
||||||
|
await conn.run_sync(Base.metadata.create_all)
|
||||||
|
yield engine
|
||||||
|
await engine.dispose()
|
||||||
|
|
||||||
|
|
||||||
|
@pytest_asyncio.fixture
|
||||||
|
async def db_session(db_engine):
|
||||||
|
session_factory = async_sessionmaker(db_engine, expire_on_commit=False, class_=AsyncSession)
|
||||||
|
async with session_factory() as session:
|
||||||
|
yield session
|
||||||
|
|
||||||
|
|
||||||
|
async def insert_test_hotel(session: AsyncSession, hotel_id: str = "TESTHOTEL"):
|
||||||
|
hotel = Hotel(
|
||||||
|
hotel_id=hotel_id,
|
||||||
|
hotel_name="Unit Test Hotel",
|
||||||
|
username="testuser",
|
||||||
|
password_hash="bcrypt-hash",
|
||||||
|
created_at=datetime.now(UTC),
|
||||||
|
updated_at=datetime.now(UTC),
|
||||||
|
is_active=True,
|
||||||
|
)
|
||||||
|
session.add(hotel)
|
||||||
|
await session.commit()
|
||||||
|
return hotel
|
||||||
|
|
||||||
|
|
||||||
|
def make_action() -> FreeRoomsAction:
|
||||||
|
return FreeRoomsAction(config=TEST_CONFIG)
|
||||||
|
|
||||||
|
|
||||||
|
def make_client_info() -> AlpineBitsClientInfo:
|
||||||
|
return AlpineBitsClientInfo(username="testuser", password="testpass")
|
||||||
|
|
||||||
|
|
||||||
|
@pytest.mark.asyncio
|
||||||
|
async def test_complete_set_creates_inventory_and_availability(db_session: AsyncSession):
|
||||||
|
await insert_test_hotel(db_session)
|
||||||
|
action = make_action()
|
||||||
|
|
||||||
|
xml = build_complete_set_xml(
|
||||||
|
daily_inventory("2025-01-01", "2025-01-03", inv_type="DBL", count=4)
|
||||||
|
)
|
||||||
|
|
||||||
|
response = await action.handle(
|
||||||
|
action="OTA_HotelInvCountNotif:FreeRooms",
|
||||||
|
request_xml=xml,
|
||||||
|
version=Version.V2024_10,
|
||||||
|
client_info=make_client_info(),
|
||||||
|
dbsession=db_session,
|
||||||
|
)
|
||||||
|
|
||||||
|
assert response.status_code == HttpStatusCode.OK
|
||||||
|
inventories = (await db_session.execute(select(HotelInventory))).scalars().all()
|
||||||
|
assert len(inventories) == 1
|
||||||
|
assert inventories[0].inv_type_code == "DBL"
|
||||||
|
rows = (
|
||||||
|
await db_session.execute(
|
||||||
|
select(RoomAvailability).order_by(RoomAvailability.date)
|
||||||
|
)
|
||||||
|
).scalars().all()
|
||||||
|
assert len(rows) == 3
|
||||||
|
assert rows[0].count_type_2 == 4
|
||||||
|
assert rows[0].update_type == "CompleteSet"
|
||||||
|
|
||||||
|
|
||||||
|
@pytest.mark.asyncio
|
||||||
|
async def test_complete_set_replaces_previous_availability(db_session: AsyncSession):
|
||||||
|
await insert_test_hotel(db_session)
|
||||||
|
action = make_action()
|
||||||
|
|
||||||
|
xml_initial = build_complete_set_xml(daily_inventory("2025-02-01", "2025-02-02", count=5))
|
||||||
|
xml_updated = build_complete_set_xml(daily_inventory("2025-02-01", "2025-02-01", count=1))
|
||||||
|
|
||||||
|
await action.handle(
|
||||||
|
"OTA_HotelInvCountNotif:FreeRooms",
|
||||||
|
xml_initial,
|
||||||
|
Version.V2024_10,
|
||||||
|
make_client_info(),
|
||||||
|
db_session,
|
||||||
|
)
|
||||||
|
|
||||||
|
await action.handle(
|
||||||
|
"OTA_HotelInvCountNotif:FreeRooms",
|
||||||
|
xml_updated,
|
||||||
|
Version.V2024_10,
|
||||||
|
make_client_info(),
|
||||||
|
db_session,
|
||||||
|
)
|
||||||
|
|
||||||
|
rows = (
|
||||||
|
await db_session.execute(select(RoomAvailability).order_by(RoomAvailability.date))
|
||||||
|
).scalars().all()
|
||||||
|
assert len(rows) == 1
|
||||||
|
assert rows[0].date.isoformat() == "2025-02-01"
|
||||||
|
assert rows[0].count_type_2 == 1
|
||||||
|
|
||||||
|
|
||||||
|
@pytest.mark.asyncio
|
||||||
|
async def test_delta_updates_only_specified_dates(db_session: AsyncSession):
|
||||||
|
await insert_test_hotel(db_session)
|
||||||
|
action = make_action()
|
||||||
|
|
||||||
|
complete_xml = build_complete_set_xml(daily_inventory("2025-03-01", "2025-03-03", count=2))
|
||||||
|
delta_xml = build_delta_xml(daily_inventory("2025-03-02", "2025-03-02", count=7))
|
||||||
|
|
||||||
|
await action.handle(
|
||||||
|
"OTA_HotelInvCountNotif:FreeRooms",
|
||||||
|
complete_xml,
|
||||||
|
Version.V2024_10,
|
||||||
|
make_client_info(),
|
||||||
|
db_session,
|
||||||
|
)
|
||||||
|
await action.handle(
|
||||||
|
"OTA_HotelInvCountNotif:FreeRooms",
|
||||||
|
delta_xml,
|
||||||
|
Version.V2024_10,
|
||||||
|
make_client_info(),
|
||||||
|
db_session,
|
||||||
|
)
|
||||||
|
|
||||||
|
rows = (
|
||||||
|
await db_session.execute(select(RoomAvailability).order_by(RoomAvailability.date))
|
||||||
|
).scalars().all()
|
||||||
|
counts = {row.date.isoformat(): row.count_type_2 for row in rows}
|
||||||
|
assert counts == {
|
||||||
|
"2025-03-01": 2,
|
||||||
|
"2025-03-02": 7,
|
||||||
|
"2025-03-03": 2,
|
||||||
|
}
|
||||||
|
assert all(row.update_type in {"CompleteSet", "Delta"} for row in rows)
|
||||||
|
|
||||||
|
|
||||||
|
@pytest.mark.asyncio
|
||||||
|
async def test_closing_season_entries_marked_correctly(db_session: AsyncSession):
|
||||||
|
await insert_test_hotel(db_session)
|
||||||
|
action = make_action()
|
||||||
|
|
||||||
|
xml = build_complete_set_xml(
|
||||||
|
"""
|
||||||
|
<Inventory>
|
||||||
|
<StatusApplicationControl Start="2025-04-01" End="2025-04-02" AllInvCode="true"/>
|
||||||
|
</Inventory>
|
||||||
|
<Inventory>
|
||||||
|
<StatusApplicationControl Start="2025-04-03" End="2025-04-03" InvTypeCode="SGL"/>
|
||||||
|
</Inventory>
|
||||||
|
"""
|
||||||
|
)
|
||||||
|
|
||||||
|
response = await action.handle(
|
||||||
|
"OTA_HotelInvCountNotif:FreeRooms",
|
||||||
|
xml,
|
||||||
|
Version.V2024_10,
|
||||||
|
make_client_info(),
|
||||||
|
db_session,
|
||||||
|
)
|
||||||
|
assert response.status_code == HttpStatusCode.OK
|
||||||
|
|
||||||
|
inventories = (await db_session.execute(select(HotelInventory))).scalars().all()
|
||||||
|
closing_inventory = next(inv for inv in inventories if inv.inv_type_code == "__CLOSE")
|
||||||
|
assert closing_inventory.inv_code is None
|
||||||
|
|
||||||
|
rows = (
|
||||||
|
await db_session.execute(select(RoomAvailability).order_by(RoomAvailability.date))
|
||||||
|
).scalars().all()
|
||||||
|
closing_rows = [row for row in rows if row.is_closing_season]
|
||||||
|
assert len(closing_rows) == 2
|
||||||
|
assert all(row.count_type_2 is None for row in closing_rows)
|
||||||
|
|
||||||
|
|
||||||
|
@pytest.mark.asyncio
|
||||||
|
async def test_closing_season_not_allowed_in_delta(db_session: AsyncSession):
|
||||||
|
await insert_test_hotel(db_session)
|
||||||
|
action = make_action()
|
||||||
|
|
||||||
|
xml = build_delta_xml(
|
||||||
|
"""
|
||||||
|
<Inventory>
|
||||||
|
<StatusApplicationControl Start="2025-05-01" End="2025-05-02" AllInvCode="true"/>
|
||||||
|
</Inventory>
|
||||||
|
"""
|
||||||
|
)
|
||||||
|
|
||||||
|
response = await action.handle(
|
||||||
|
"OTA_HotelInvCountNotif:FreeRooms",
|
||||||
|
xml,
|
||||||
|
Version.V2024_10,
|
||||||
|
make_client_info(),
|
||||||
|
db_session,
|
||||||
|
)
|
||||||
|
assert response.status_code == HttpStatusCode.BAD_REQUEST
|
||||||
|
assert "Closing seasons" in response.xml_content
|
||||||
|
|
||||||
|
|
||||||
|
@pytest.mark.asyncio
|
||||||
|
async def test_missing_invtypecode_returns_error(db_session: AsyncSession):
|
||||||
|
await insert_test_hotel(db_session)
|
||||||
|
action = make_action()
|
||||||
|
|
||||||
|
xml = build_complete_set_xml(
|
||||||
|
"""
|
||||||
|
<Inventory>
|
||||||
|
<StatusApplicationControl Start="2025-06-01" End="2025-06-02"/>
|
||||||
|
</Inventory>
|
||||||
|
"""
|
||||||
|
)
|
||||||
|
|
||||||
|
response = await action.handle(
|
||||||
|
"OTA_HotelInvCountNotif:FreeRooms",
|
||||||
|
xml,
|
||||||
|
Version.V2024_10,
|
||||||
|
make_client_info(),
|
||||||
|
db_session,
|
||||||
|
)
|
||||||
|
assert response.status_code == HttpStatusCode.BAD_REQUEST
|
||||||
|
assert "InvTypeCode is required" in response.xml_content
|
||||||
|
|
||||||
|
|
||||||
|
@pytest.mark.asyncio
|
||||||
|
async def test_duplicate_count_type_rejected(db_session: AsyncSession):
|
||||||
|
await insert_test_hotel(db_session)
|
||||||
|
action = make_action()
|
||||||
|
|
||||||
|
xml = build_complete_set_xml(
|
||||||
|
"""
|
||||||
|
<Inventory>
|
||||||
|
<StatusApplicationControl Start="2025-07-01" End="2025-07-01" InvTypeCode="SGL"/>
|
||||||
|
<InvCounts>
|
||||||
|
<InvCount CountType="2" Count="3"/>
|
||||||
|
<InvCount CountType="2" Count="4"/>
|
||||||
|
</InvCounts>
|
||||||
|
</Inventory>
|
||||||
|
"""
|
||||||
|
)
|
||||||
|
|
||||||
|
response = await action.handle(
|
||||||
|
"OTA_HotelInvCountNotif:FreeRooms",
|
||||||
|
xml,
|
||||||
|
Version.V2024_10,
|
||||||
|
make_client_info(),
|
||||||
|
db_session,
|
||||||
|
)
|
||||||
|
assert response.status_code == HttpStatusCode.BAD_REQUEST
|
||||||
|
assert "Duplicate CountType" in response.xml_content
|
||||||
|
|
||||||
|
|
||||||
|
@pytest.mark.asyncio
|
||||||
|
async def test_invalid_date_range_returns_error(db_session: AsyncSession):
|
||||||
|
await insert_test_hotel(db_session)
|
||||||
|
action = make_action()
|
||||||
|
|
||||||
|
xml = build_complete_set_xml(
|
||||||
|
"""
|
||||||
|
<Inventory>
|
||||||
|
<StatusApplicationControl Start="2025-08-10" End="2025-08-01" InvTypeCode="DBL"/>
|
||||||
|
</Inventory>
|
||||||
|
"""
|
||||||
|
)
|
||||||
|
|
||||||
|
response = await action.handle(
|
||||||
|
"OTA_HotelInvCountNotif:FreeRooms",
|
||||||
|
xml,
|
||||||
|
Version.V2024_10,
|
||||||
|
make_client_info(),
|
||||||
|
db_session,
|
||||||
|
)
|
||||||
|
assert response.status_code == HttpStatusCode.BAD_REQUEST
|
||||||
|
assert "End date cannot be before Start date" in response.xml_content
|
||||||
|
|
||||||
|
|
||||||
|
@pytest.mark.asyncio
|
||||||
|
async def test_invalid_credentials_return_unauthorized(db_session: AsyncSession):
|
||||||
|
await insert_test_hotel(db_session)
|
||||||
|
action = make_action()
|
||||||
|
bad_client = AlpineBitsClientInfo(username="testuser", password="wrongpass")
|
||||||
|
|
||||||
|
xml = build_complete_set_xml(daily_inventory("2025-09-01", "2025-09-01"))
|
||||||
|
|
||||||
|
response = await action.handle(
|
||||||
|
"OTA_HotelInvCountNotif:FreeRooms",
|
||||||
|
xml,
|
||||||
|
Version.V2024_10,
|
||||||
|
bad_client,
|
||||||
|
db_session,
|
||||||
|
)
|
||||||
|
assert response.status_code == HttpStatusCode.UNAUTHORIZED
|
||||||
|
assert "Unauthorized" in response.xml_content
|
||||||
|
|
||||||
|
|
||||||
|
@pytest.mark.asyncio
|
||||||
|
async def test_invalid_xml_returns_error(db_session: AsyncSession):
|
||||||
|
await insert_test_hotel(db_session)
|
||||||
|
action = make_action()
|
||||||
|
client_info = make_client_info()
|
||||||
|
|
||||||
|
response = await action.handle(
|
||||||
|
"OTA_HotelInvCountNotif:FreeRooms",
|
||||||
|
"<invalid",
|
||||||
|
Version.V2024_10,
|
||||||
|
client_info,
|
||||||
|
db_session,
|
||||||
|
)
|
||||||
|
assert response.status_code == HttpStatusCode.BAD_REQUEST
|
||||||
|
assert "Invalid XML payload" in response.xml_content
|
||||||
218
tests/test_schemas_webhook.py
Normal file
218
tests/test_schemas_webhook.py
Normal file
@@ -0,0 +1,218 @@
|
|||||||
|
"""Tests for webhook-related Pydantic schemas."""
|
||||||
|
|
||||||
|
import hashlib
|
||||||
|
import json
|
||||||
|
from datetime import datetime
|
||||||
|
|
||||||
|
import pytest
|
||||||
|
from pydantic import ValidationError
|
||||||
|
|
||||||
|
from alpine_bits_python.const import WebhookStatus
|
||||||
|
from alpine_bits_python.schemas import (
|
||||||
|
HotelData,
|
||||||
|
WebhookEndpointData,
|
||||||
|
WebhookRequestData,
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
class TestHotelData:
|
||||||
|
"""Tests for HotelData schema."""
|
||||||
|
|
||||||
|
def test_valid_hotel_data(self):
|
||||||
|
"""Test creating a valid HotelData instance."""
|
||||||
|
data = HotelData(
|
||||||
|
hotel_id="hotel123",
|
||||||
|
hotel_name="Test Hotel",
|
||||||
|
username="admin",
|
||||||
|
password_hash="hashed_password_123",
|
||||||
|
)
|
||||||
|
assert data.hotel_id == "hotel123"
|
||||||
|
assert data.hotel_name == "Test Hotel"
|
||||||
|
assert data.username == "admin"
|
||||||
|
assert data.password_hash == "hashed_password_123"
|
||||||
|
assert data.is_active is True
|
||||||
|
assert isinstance(data.created_at, datetime)
|
||||||
|
|
||||||
|
def test_whitespace_stripping(self):
|
||||||
|
"""Test that whitespace is stripped from string fields."""
|
||||||
|
data = HotelData(
|
||||||
|
hotel_id=" hotel123 ",
|
||||||
|
hotel_name=" Test Hotel ",
|
||||||
|
username=" admin ",
|
||||||
|
password_hash="hashed_password_123",
|
||||||
|
)
|
||||||
|
assert data.hotel_id == "hotel123"
|
||||||
|
assert data.hotel_name == "Test Hotel"
|
||||||
|
assert data.username == "admin"
|
||||||
|
|
||||||
|
def test_optional_fields(self):
|
||||||
|
"""Test that optional fields can be None."""
|
||||||
|
data = HotelData(
|
||||||
|
hotel_id="hotel123",
|
||||||
|
hotel_name="Test Hotel",
|
||||||
|
username="admin",
|
||||||
|
password_hash="hashed_password_123",
|
||||||
|
meta_account_id=None,
|
||||||
|
google_account_id=None,
|
||||||
|
)
|
||||||
|
assert data.meta_account_id is None
|
||||||
|
assert data.google_account_id is None
|
||||||
|
|
||||||
|
|
||||||
|
class TestWebhookEndpointData:
|
||||||
|
"""Tests for WebhookEndpointData schema."""
|
||||||
|
|
||||||
|
def test_valid_webhook_endpoint(self):
|
||||||
|
"""Test creating a valid WebhookEndpointData instance."""
|
||||||
|
data = WebhookEndpointData(
|
||||||
|
hotel_id="hotel123",
|
||||||
|
webhook_secret="secret_abc123",
|
||||||
|
webhook_type="wix_form",
|
||||||
|
)
|
||||||
|
assert data.hotel_id == "hotel123"
|
||||||
|
assert data.webhook_secret == "secret_abc123"
|
||||||
|
assert data.webhook_type == "wix_form"
|
||||||
|
assert data.is_enabled is True
|
||||||
|
assert isinstance(data.created_at, datetime)
|
||||||
|
|
||||||
|
def test_webhook_endpoint_with_description(self):
|
||||||
|
"""Test WebhookEndpointData with optional description."""
|
||||||
|
data = WebhookEndpointData(
|
||||||
|
hotel_id="hotel123",
|
||||||
|
webhook_secret="secret_abc123",
|
||||||
|
webhook_type="generic",
|
||||||
|
description="Main booking form",
|
||||||
|
)
|
||||||
|
assert data.description == "Main booking form"
|
||||||
|
|
||||||
|
def test_whitespace_stripping(self):
|
||||||
|
"""Test that whitespace is stripped from string fields."""
|
||||||
|
data = WebhookEndpointData(
|
||||||
|
hotel_id=" hotel123 ",
|
||||||
|
webhook_secret=" secret_abc123 ",
|
||||||
|
webhook_type=" wix_form ",
|
||||||
|
)
|
||||||
|
assert data.hotel_id == "hotel123"
|
||||||
|
assert data.webhook_secret == "secret_abc123"
|
||||||
|
assert data.webhook_type == "wix_form"
|
||||||
|
|
||||||
|
|
||||||
|
class TestWebhookRequestData:
|
||||||
|
"""Tests for WebhookRequestData schema."""
|
||||||
|
|
||||||
|
def test_auto_calculate_payload_hash(self):
|
||||||
|
"""Test that payload_hash is auto-calculated from payload_json."""
|
||||||
|
payload = {"name": "John", "email": "john@example.com"}
|
||||||
|
data = WebhookRequestData(payload_json=payload)
|
||||||
|
|
||||||
|
# Verify hash was calculated
|
||||||
|
assert data.payload_hash is not None
|
||||||
|
assert len(data.payload_hash) == 64 # SHA256 produces 64 hex chars
|
||||||
|
|
||||||
|
# Verify it matches the expected hash (same algorithm as api.py)
|
||||||
|
payload_json_str = json.dumps(payload, sort_keys=True)
|
||||||
|
expected_hash = hashlib.sha256(payload_json_str.encode("utf-8")).hexdigest()
|
||||||
|
assert data.payload_hash == expected_hash
|
||||||
|
|
||||||
|
def test_explicit_payload_hash(self):
|
||||||
|
"""Test providing payload_hash explicitly (for purged payloads)."""
|
||||||
|
explicit_hash = "a" * 64
|
||||||
|
data = WebhookRequestData(
|
||||||
|
payload_json=None,
|
||||||
|
payload_hash=explicit_hash,
|
||||||
|
)
|
||||||
|
assert data.payload_hash == explicit_hash
|
||||||
|
assert data.payload_json is None
|
||||||
|
|
||||||
|
def test_payload_hash_required(self):
|
||||||
|
"""Test that payload_hash is required (either calculated or explicit)."""
|
||||||
|
with pytest.raises(ValidationError) as exc_info:
|
||||||
|
WebhookRequestData(
|
||||||
|
payload_json=None,
|
||||||
|
payload_hash=None,
|
||||||
|
)
|
||||||
|
assert "payload_hash is required" in str(exc_info.value)
|
||||||
|
|
||||||
|
def test_consistent_hashing(self):
|
||||||
|
"""Test that the same payload always produces the same hash."""
|
||||||
|
payload = {"b": 2, "a": 1, "c": 3} # Unordered keys
|
||||||
|
|
||||||
|
data1 = WebhookRequestData(payload_json=payload.copy())
|
||||||
|
data2 = WebhookRequestData(payload_json=payload.copy())
|
||||||
|
|
||||||
|
assert data1.payload_hash == data2.payload_hash
|
||||||
|
|
||||||
|
def test_default_status(self):
|
||||||
|
"""Test that status defaults to PENDING."""
|
||||||
|
data = WebhookRequestData(payload_json={"test": "data"})
|
||||||
|
assert data.status == WebhookStatus.PENDING
|
||||||
|
|
||||||
|
def test_status_normalization(self):
|
||||||
|
"""Test that status is normalized to WebhookStatus enum."""
|
||||||
|
data = WebhookRequestData(
|
||||||
|
payload_json={"test": "data"},
|
||||||
|
status="completed", # String
|
||||||
|
)
|
||||||
|
assert data.status == WebhookStatus.COMPLETED
|
||||||
|
assert isinstance(data.status, WebhookStatus)
|
||||||
|
|
||||||
|
def test_retry_count_default(self):
|
||||||
|
"""Test that retry_count defaults to 0."""
|
||||||
|
data = WebhookRequestData(payload_json={"test": "data"})
|
||||||
|
assert data.retry_count == 0
|
||||||
|
|
||||||
|
def test_optional_foreign_keys(self):
|
||||||
|
"""Test optional foreign key fields."""
|
||||||
|
data = WebhookRequestData(
|
||||||
|
payload_json={"test": "data"},
|
||||||
|
webhook_endpoint_id=123,
|
||||||
|
hotel_id="hotel456",
|
||||||
|
)
|
||||||
|
assert data.webhook_endpoint_id == 123
|
||||||
|
assert data.hotel_id == "hotel456"
|
||||||
|
|
||||||
|
def test_result_tracking(self):
|
||||||
|
"""Test result tracking fields."""
|
||||||
|
data = WebhookRequestData(
|
||||||
|
payload_json={"test": "data"},
|
||||||
|
created_customer_id=1,
|
||||||
|
created_reservation_id=2,
|
||||||
|
)
|
||||||
|
assert data.created_customer_id == 1
|
||||||
|
assert data.created_reservation_id == 2
|
||||||
|
|
||||||
|
def test_purged_payload(self):
|
||||||
|
"""Test representing a purged webhook request (after processing)."""
|
||||||
|
explicit_hash = "b" * 64
|
||||||
|
data = WebhookRequestData(
|
||||||
|
payload_json=None,
|
||||||
|
payload_hash=explicit_hash,
|
||||||
|
status=WebhookStatus.COMPLETED,
|
||||||
|
purged_at=datetime.now(),
|
||||||
|
)
|
||||||
|
assert data.payload_json is None
|
||||||
|
assert data.payload_hash == explicit_hash
|
||||||
|
assert data.status == WebhookStatus.COMPLETED
|
||||||
|
assert data.purged_at is not None
|
||||||
|
|
||||||
|
def test_processing_metadata(self):
|
||||||
|
"""Test processing tracking fields."""
|
||||||
|
now = datetime.now()
|
||||||
|
data = WebhookRequestData(
|
||||||
|
payload_json={"test": "data"},
|
||||||
|
status=WebhookStatus.PROCESSING,
|
||||||
|
processing_started_at=now,
|
||||||
|
)
|
||||||
|
assert data.status == WebhookStatus.PROCESSING
|
||||||
|
assert data.processing_started_at == now
|
||||||
|
assert data.processing_completed_at is None
|
||||||
|
|
||||||
|
def test_request_metadata(self):
|
||||||
|
"""Test request metadata fields."""
|
||||||
|
data = WebhookRequestData(
|
||||||
|
payload_json={"test": "data"},
|
||||||
|
source_ip="192.168.1.1",
|
||||||
|
user_agent="Mozilla/5.0",
|
||||||
|
)
|
||||||
|
assert data.source_ip == "192.168.1.1"
|
||||||
|
assert data.user_agent == "Mozilla/5.0"
|
||||||
340
tests/test_webhook_duplicates.py
Normal file
340
tests/test_webhook_duplicates.py
Normal file
@@ -0,0 +1,340 @@
|
|||||||
|
"""Tests for webhook duplicate handling and reprocessing.
|
||||||
|
|
||||||
|
This module tests:
|
||||||
|
- Duplicate detection during normal operation
|
||||||
|
- Duplicate handling during app startup reprocessing
|
||||||
|
- Stuck webhooks that are duplicates
|
||||||
|
"""
|
||||||
|
|
||||||
|
import asyncio
|
||||||
|
import json
|
||||||
|
import uuid
|
||||||
|
from datetime import UTC, datetime
|
||||||
|
from pathlib import Path
|
||||||
|
from unittest.mock import patch
|
||||||
|
|
||||||
|
import pytest
|
||||||
|
import pytest_asyncio
|
||||||
|
from fastapi.testclient import TestClient
|
||||||
|
from sqlalchemy import select
|
||||||
|
from sqlalchemy.ext.asyncio import AsyncSession, async_sessionmaker, create_async_engine
|
||||||
|
|
||||||
|
from alpine_bits_python.api import app
|
||||||
|
from alpine_bits_python.const import WebhookStatus
|
||||||
|
from alpine_bits_python.db import Base, Reservation, WebhookRequest
|
||||||
|
from alpine_bits_python.db_setup import reprocess_stuck_webhooks
|
||||||
|
from alpine_bits_python.schemas import WebhookRequestData
|
||||||
|
from alpine_bits_python.webhook_processor import initialize_webhook_processors, webhook_registry
|
||||||
|
|
||||||
|
|
||||||
|
@pytest_asyncio.fixture
|
||||||
|
async def test_db_engine():
|
||||||
|
"""Create an in-memory SQLite database for testing."""
|
||||||
|
engine = create_async_engine(
|
||||||
|
"sqlite+aiosqlite:///:memory:",
|
||||||
|
echo=False,
|
||||||
|
)
|
||||||
|
|
||||||
|
# Create tables
|
||||||
|
async with engine.begin() as conn:
|
||||||
|
await conn.run_sync(Base.metadata.create_all)
|
||||||
|
|
||||||
|
yield engine
|
||||||
|
|
||||||
|
# Cleanup
|
||||||
|
await engine.dispose()
|
||||||
|
|
||||||
|
|
||||||
|
@pytest.fixture
|
||||||
|
def test_config():
|
||||||
|
"""Test configuration."""
|
||||||
|
return {
|
||||||
|
"server": {
|
||||||
|
"codecontext": "ADVERTISING",
|
||||||
|
"code": "70597314",
|
||||||
|
"companyname": "99tales Gmbh",
|
||||||
|
"res_id_source_context": "99tales",
|
||||||
|
},
|
||||||
|
"alpine_bits_auth": [
|
||||||
|
{
|
||||||
|
"hotel_id": "HOTEL123",
|
||||||
|
"hotel_name": "Test Hotel",
|
||||||
|
"username": "testuser",
|
||||||
|
"password": "testpass",
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"default_hotel_code": "HOTEL123",
|
||||||
|
"default_hotel_name": "Test Hotel",
|
||||||
|
"database": {"url": "sqlite+aiosqlite:///:memory:"},
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
@pytest.fixture
|
||||||
|
def sample_wix_form_data():
|
||||||
|
"""Sample Wix form submission data with FIXED submissionId for duplicate testing."""
|
||||||
|
return {
|
||||||
|
"data": {
|
||||||
|
"submissionId": "FIXED-DUPLICATE-TEST-ID", # Fixed ID to trigger duplicates
|
||||||
|
"submissionTime": "2025-10-07T05:48:41.855Z",
|
||||||
|
"contact": {
|
||||||
|
"name": {"first": "John", "last": "Doe"},
|
||||||
|
"email": "john.doe.duplicate.test@example.com",
|
||||||
|
"phones": [{"e164Phone": "+1234567890"}],
|
||||||
|
"locale": "en-US",
|
||||||
|
"contactId": "contact-duplicate-test",
|
||||||
|
},
|
||||||
|
"field:anrede": "Mr.",
|
||||||
|
"field:form_field_5a7b": True,
|
||||||
|
"field:date_picker_a7c8": "2024-12-25",
|
||||||
|
"field:date_picker_7e65": "2024-12-31",
|
||||||
|
"field:number_7cf5": "2",
|
||||||
|
"field:anzahl_kinder": "1",
|
||||||
|
"field:alter_kind_1": "8",
|
||||||
|
"field:angebot_auswaehlen": "Christmas Special",
|
||||||
|
"field:utm_source": "google",
|
||||||
|
"field:utm_medium": "cpc",
|
||||||
|
"field:utm_campaign": "winter2024",
|
||||||
|
"field:fbclid": "test_fbclid_123",
|
||||||
|
"field:long_answer_3524": "Late check-in please",
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
class TestWebhookDuplicateHandling:
|
||||||
|
"""Test duplicate webhook handling during normal operation."""
|
||||||
|
|
||||||
|
def test_duplicate_webhook_during_operation(self, test_config, sample_wix_form_data):
|
||||||
|
"""Test that sending the same webhook twice handles duplicates gracefully."""
|
||||||
|
# Create engine and tables
|
||||||
|
engine = create_async_engine(
|
||||||
|
"sqlite+aiosqlite:///:memory:",
|
||||||
|
echo=False,
|
||||||
|
)
|
||||||
|
|
||||||
|
async def create_tables():
|
||||||
|
async with engine.begin() as conn:
|
||||||
|
await conn.run_sync(Base.metadata.create_all)
|
||||||
|
|
||||||
|
asyncio.run(create_tables())
|
||||||
|
|
||||||
|
# Mock config and database to use our test database
|
||||||
|
with patch("alpine_bits_python.api.load_config", return_value=test_config), \
|
||||||
|
patch("alpine_bits_python.api.create_database_engine", return_value=engine):
|
||||||
|
|
||||||
|
from alpine_bits_python.alpinebits_server import AlpineBitsServer
|
||||||
|
|
||||||
|
# Setup app state
|
||||||
|
app.state.engine = engine
|
||||||
|
app.state.async_sessionmaker = async_sessionmaker(
|
||||||
|
engine, expire_on_commit=False
|
||||||
|
)
|
||||||
|
app.state.config = test_config
|
||||||
|
app.state.alpine_bits_server = AlpineBitsServer(test_config)
|
||||||
|
|
||||||
|
with TestClient(app) as client:
|
||||||
|
# First submission - should succeed
|
||||||
|
response1 = client.post(
|
||||||
|
"/api/webhook/wix-form",
|
||||||
|
json=sample_wix_form_data
|
||||||
|
)
|
||||||
|
assert response1.status_code == 200
|
||||||
|
data1 = response1.json()
|
||||||
|
assert data1["status"] == "success"
|
||||||
|
|
||||||
|
# Second submission with same data - should detect duplicate at API level
|
||||||
|
response2 = client.post(
|
||||||
|
"/api/webhook/wix-form",
|
||||||
|
json=sample_wix_form_data
|
||||||
|
)
|
||||||
|
assert response2.status_code == 200
|
||||||
|
data2 = response2.json()
|
||||||
|
# API returns success for already-processed webhooks, but sets duplicate flag
|
||||||
|
assert data2["status"] == "success"
|
||||||
|
assert data2.get("duplicate") is True
|
||||||
|
assert "already processed" in data2["message"].lower()
|
||||||
|
|
||||||
|
# Cleanup
|
||||||
|
asyncio.run(engine.dispose())
|
||||||
|
|
||||||
|
|
||||||
|
class TestWebhookReprocessing:
|
||||||
|
"""Test webhook reprocessing on app restart."""
|
||||||
|
|
||||||
|
@pytest.mark.asyncio
|
||||||
|
async def test_reprocess_stuck_duplicate_webhook(self, test_db_engine, test_config):
|
||||||
|
"""Test that stuck webhooks that are duplicates are handled correctly on restart."""
|
||||||
|
AsyncSessionLocal = async_sessionmaker(test_db_engine, expire_on_commit=False)
|
||||||
|
|
||||||
|
# Step 1: Process a webhook normally to create a reservation
|
||||||
|
from alpine_bits_python.webhook_processor import process_wix_form_submission
|
||||||
|
|
||||||
|
test_form_file = Path(__file__).parent / "test_data" / f"test_form{1}.json"
|
||||||
|
|
||||||
|
if not test_form_file.exists():
|
||||||
|
pytest.skip(f"{test_form_file.name} not found")
|
||||||
|
|
||||||
|
# Load test form data
|
||||||
|
with test_form_file.open() as f:
|
||||||
|
test_data = json.load(f)
|
||||||
|
|
||||||
|
test_data["data"]["submissionId"] = "STUCK-WEBHOOK-TEST-ID" # Fixed ID for duplicate test
|
||||||
|
|
||||||
|
async with AsyncSessionLocal() as session:
|
||||||
|
result = await process_wix_form_submission(
|
||||||
|
test_data, session, config=test_config
|
||||||
|
)
|
||||||
|
await session.commit()
|
||||||
|
assert result["status"] == "success"
|
||||||
|
|
||||||
|
# Step 2: Verify the reservation was created
|
||||||
|
async with AsyncSessionLocal() as session:
|
||||||
|
stmt = select(Reservation).where(
|
||||||
|
Reservation.unique_id == "STUCK-WEBHOOK-TEST-ID"
|
||||||
|
)
|
||||||
|
result = await session.execute(stmt)
|
||||||
|
reservation = result.scalar_one_or_none()
|
||||||
|
assert reservation is not None, "Reservation should exist"
|
||||||
|
assert reservation.unique_id == "STUCK-WEBHOOK-TEST-ID"
|
||||||
|
|
||||||
|
# Step 3: Manually create a webhook request stuck in "processing" status
|
||||||
|
# This simulates a webhook that was being processed when the app crashed
|
||||||
|
from alpine_bits_python.db import WebhookEndpoint, Hotel
|
||||||
|
|
||||||
|
async with AsyncSessionLocal() as session:
|
||||||
|
# Create hotel
|
||||||
|
hotel = Hotel(
|
||||||
|
hotel_id="HOTEL123",
|
||||||
|
hotel_name="Test Hotel",
|
||||||
|
username="testuser",
|
||||||
|
password_hash="dummy",
|
||||||
|
created_at=datetime.now(UTC),
|
||||||
|
updated_at=datetime.now(UTC),
|
||||||
|
is_active=True,
|
||||||
|
)
|
||||||
|
session.add(hotel)
|
||||||
|
await session.flush()
|
||||||
|
|
||||||
|
# Create webhook endpoint
|
||||||
|
endpoint = WebhookEndpoint(
|
||||||
|
hotel_id="HOTEL123",
|
||||||
|
webhook_type="wix_form",
|
||||||
|
webhook_secret="test-secret-123",
|
||||||
|
is_enabled=True,
|
||||||
|
created_at=datetime.now(UTC),
|
||||||
|
)
|
||||||
|
session.add(endpoint)
|
||||||
|
await session.flush()
|
||||||
|
|
||||||
|
# Create stuck webhook request with the SAME payload
|
||||||
|
stuck_webhook_data = WebhookRequestData(
|
||||||
|
webhook_endpoint_id=endpoint.id,
|
||||||
|
hotel_id="HOTEL123",
|
||||||
|
payload_json=test_data,
|
||||||
|
status=WebhookStatus.PROCESSING, # Stuck in processing!
|
||||||
|
created_at=datetime.now(UTC),
|
||||||
|
)
|
||||||
|
|
||||||
|
stuck_webhook = WebhookRequest(**stuck_webhook_data.model_dump())
|
||||||
|
session.add(stuck_webhook)
|
||||||
|
await session.commit()
|
||||||
|
|
||||||
|
# initialize wix_form processor
|
||||||
|
|
||||||
|
initialize_webhook_processors()
|
||||||
|
|
||||||
|
|
||||||
|
# Step 4: Run reprocessing (simulates app restart)
|
||||||
|
await reprocess_stuck_webhooks(AsyncSessionLocal, test_config)
|
||||||
|
|
||||||
|
# Step 5: Verify the stuck webhook was marked as completed (not failed)
|
||||||
|
async with AsyncSessionLocal() as session:
|
||||||
|
stmt = select(WebhookRequest).where(
|
||||||
|
WebhookRequest.status == WebhookStatus.COMPLETED
|
||||||
|
)
|
||||||
|
result = await session.execute(stmt)
|
||||||
|
completed_webhooks = result.scalars().all()
|
||||||
|
assert len(completed_webhooks) == 1
|
||||||
|
assert completed_webhooks[0].last_error is None
|
||||||
|
|
||||||
|
# Verify no failed webhooks
|
||||||
|
stmt = select(WebhookRequest).where(
|
||||||
|
WebhookRequest.status == WebhookStatus.FAILED
|
||||||
|
)
|
||||||
|
result = await session.execute(stmt)
|
||||||
|
failed_webhooks = result.scalars().all()
|
||||||
|
assert len(failed_webhooks) == 0
|
||||||
|
|
||||||
|
# Step 6: Verify only ONE reservation exists (no duplicate)
|
||||||
|
async with AsyncSessionLocal() as session:
|
||||||
|
stmt = select(Reservation)
|
||||||
|
result = await session.execute(stmt)
|
||||||
|
reservations = result.scalars().all()
|
||||||
|
assert len(reservations) == 1
|
||||||
|
|
||||||
|
|
||||||
|
class TestWebhookReprocessingNeverBlocksStartup:
|
||||||
|
"""Test that reprocessing never blocks app startup."""
|
||||||
|
|
||||||
|
@pytest.mark.asyncio
|
||||||
|
async def test_reprocessing_error_does_not_block_startup(
|
||||||
|
self, test_db_engine, test_config
|
||||||
|
):
|
||||||
|
"""Test that even if reprocessing fails, app startup continues."""
|
||||||
|
AsyncSessionLocal = async_sessionmaker(test_db_engine, expire_on_commit=False)
|
||||||
|
|
||||||
|
from alpine_bits_python.db import WebhookEndpoint, Hotel
|
||||||
|
|
||||||
|
# Create a stuck webhook with invalid data that will cause processing to fail
|
||||||
|
async with AsyncSessionLocal() as session:
|
||||||
|
# Create hotel
|
||||||
|
hotel = Hotel(
|
||||||
|
hotel_id="HOTEL123",
|
||||||
|
hotel_name="Test Hotel",
|
||||||
|
username="testuser",
|
||||||
|
password_hash="dummy",
|
||||||
|
created_at=datetime.now(UTC),
|
||||||
|
updated_at=datetime.now(UTC),
|
||||||
|
is_active=True,
|
||||||
|
)
|
||||||
|
session.add(hotel)
|
||||||
|
await session.flush()
|
||||||
|
|
||||||
|
# Create webhook endpoint
|
||||||
|
endpoint = WebhookEndpoint(
|
||||||
|
hotel_id="HOTEL123",
|
||||||
|
webhook_type="wix_form",
|
||||||
|
webhook_secret="test-secret-123",
|
||||||
|
is_enabled=True,
|
||||||
|
created_at=datetime.now(UTC),
|
||||||
|
)
|
||||||
|
session.add(endpoint)
|
||||||
|
await session.flush()
|
||||||
|
|
||||||
|
webhook_request = WebhookRequestData(
|
||||||
|
hotel_id="HOTEL123",
|
||||||
|
payload_json={"data": {"invalid": "data"}}, # Missing required fields
|
||||||
|
status=WebhookStatus.PROCESSING
|
||||||
|
)
|
||||||
|
stuck_webhook = WebhookRequest(**webhook_request.model_dump())
|
||||||
|
|
||||||
|
|
||||||
|
session.add(stuck_webhook) ## Cannot add the stuck webhook. Integrity Error payload_hash is missing
|
||||||
|
await session.commit()
|
||||||
|
|
||||||
|
# This should NOT raise an exception - it should log and continue
|
||||||
|
try:
|
||||||
|
await reprocess_stuck_webhooks(AsyncSessionLocal, test_config)
|
||||||
|
except Exception as e:
|
||||||
|
pytest.fail(
|
||||||
|
f"reprocess_stuck_webhooks should NEVER raise exceptions, but got: {e}"
|
||||||
|
)
|
||||||
|
|
||||||
|
# Verify the webhook was marked as failed
|
||||||
|
async with AsyncSessionLocal() as session:
|
||||||
|
stmt = select(WebhookRequest).where(
|
||||||
|
WebhookRequest.status == WebhookStatus.FAILED
|
||||||
|
)
|
||||||
|
result = await session.execute(stmt)
|
||||||
|
failed_webhooks = result.scalars().all()
|
||||||
|
assert len(failed_webhooks) == 1
|
||||||
|
assert failed_webhooks[0].last_error is not None
|
||||||
327
tests/test_xml_builders.py
Normal file
327
tests/test_xml_builders.py
Normal file
@@ -0,0 +1,327 @@
|
|||||||
|
"""Tests for XML builder helpers."""
|
||||||
|
|
||||||
|
import pytest
|
||||||
|
from xml.etree import ElementTree as ET
|
||||||
|
|
||||||
|
from tests.helpers.xml_builders import (
|
||||||
|
ReservationXMLBuilder,
|
||||||
|
MultiReservationXMLBuilder,
|
||||||
|
RoomReservationBuilder,
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
class TestRoomReservationBuilder:
|
||||||
|
"""Test RoomReservationBuilder functionality."""
|
||||||
|
|
||||||
|
def test_basic_room_without_revenue(self):
|
||||||
|
"""Test creating a basic room reservation without revenue."""
|
||||||
|
builder = RoomReservationBuilder(
|
||||||
|
arrival="2025-12-01",
|
||||||
|
departure="2025-12-03",
|
||||||
|
room_type="DZV",
|
||||||
|
room_number="101",
|
||||||
|
)
|
||||||
|
|
||||||
|
elem = builder.build()
|
||||||
|
|
||||||
|
assert elem.tag == "roomReservation"
|
||||||
|
assert elem.get("arrival") == "2025-12-01"
|
||||||
|
assert elem.get("departure") == "2025-12-03"
|
||||||
|
assert elem.get("roomType") == "DZV"
|
||||||
|
assert elem.get("roomNumber") == "101"
|
||||||
|
|
||||||
|
# Check daily sales - should have 3 entries (12-01, 12-02, 12-03)
|
||||||
|
daily_sales = elem.find("dailySales")
|
||||||
|
assert daily_sales is not None
|
||||||
|
daily_sale_elements = daily_sales.findall("dailySale")
|
||||||
|
assert len(daily_sale_elements) == 3
|
||||||
|
|
||||||
|
# First two should have no revenue attributes
|
||||||
|
assert daily_sale_elements[0].get("revenueTotal") is None
|
||||||
|
assert daily_sale_elements[0].get("revenueLogis") is None
|
||||||
|
|
||||||
|
def test_room_with_revenue(self):
|
||||||
|
"""Test creating a room with revenue per day."""
|
||||||
|
builder = RoomReservationBuilder(
|
||||||
|
arrival="2025-12-01",
|
||||||
|
departure="2025-12-03",
|
||||||
|
room_type="DZV",
|
||||||
|
room_number="101",
|
||||||
|
revenue_logis_per_day=150.0,
|
||||||
|
)
|
||||||
|
|
||||||
|
elem = builder.build()
|
||||||
|
daily_sales = elem.find("dailySales")
|
||||||
|
daily_sale_elements = daily_sales.findall("dailySale")
|
||||||
|
|
||||||
|
# Should have 3 entries total
|
||||||
|
assert len(daily_sale_elements) == 3
|
||||||
|
|
||||||
|
# First two days should have revenue
|
||||||
|
assert daily_sale_elements[0].get("revenueTotal") == "150.0"
|
||||||
|
assert daily_sale_elements[0].get("revenueLogis") == "150.0"
|
||||||
|
assert daily_sale_elements[1].get("revenueTotal") == "150.0"
|
||||||
|
assert daily_sale_elements[1].get("revenueLogis") == "150.0"
|
||||||
|
|
||||||
|
# Departure day should have no revenue
|
||||||
|
assert daily_sale_elements[2].get("revenueTotal") is None
|
||||||
|
assert daily_sale_elements[2].get("revenueLogis") is None
|
||||||
|
|
||||||
|
def test_room_with_children_and_infants(self):
|
||||||
|
"""Test room with children and infants attributes."""
|
||||||
|
builder = RoomReservationBuilder(
|
||||||
|
arrival="2025-12-01",
|
||||||
|
departure="2025-12-02",
|
||||||
|
adults=2,
|
||||||
|
children=1,
|
||||||
|
infants=1,
|
||||||
|
)
|
||||||
|
|
||||||
|
elem = builder.build()
|
||||||
|
assert elem.get("adults") == "2"
|
||||||
|
assert elem.get("children") == "1"
|
||||||
|
assert elem.get("infants") == "1"
|
||||||
|
|
||||||
|
|
||||||
|
class TestReservationXMLBuilder:
|
||||||
|
"""Test ReservationXMLBuilder functionality."""
|
||||||
|
|
||||||
|
def test_basic_reservation(self):
|
||||||
|
"""Test creating a basic reservation with one room."""
|
||||||
|
builder = ReservationXMLBuilder(
|
||||||
|
hotel_id="39054_001",
|
||||||
|
reservation_id="12345",
|
||||||
|
reservation_number="RES-001",
|
||||||
|
reservation_date="2025-11-14",
|
||||||
|
)
|
||||||
|
builder.set_guest(
|
||||||
|
guest_id="guest_001",
|
||||||
|
first_name="John",
|
||||||
|
last_name="Doe",
|
||||||
|
email="john@example.com",
|
||||||
|
)
|
||||||
|
builder.add_room(
|
||||||
|
arrival="2025-12-01",
|
||||||
|
departure="2025-12-05",
|
||||||
|
revenue_logis_per_day=150.0,
|
||||||
|
)
|
||||||
|
|
||||||
|
xml_string = builder.build_xml()
|
||||||
|
|
||||||
|
# Parse and verify structure
|
||||||
|
root = ET.fromstring(xml_string)
|
||||||
|
assert root.tag == "reservations"
|
||||||
|
|
||||||
|
reservation = root.find("reservation")
|
||||||
|
assert reservation is not None
|
||||||
|
assert reservation.get("hotelID") == "39054_001"
|
||||||
|
assert reservation.get("id") == "12345"
|
||||||
|
assert reservation.get("number") == "RES-001"
|
||||||
|
|
||||||
|
guest = reservation.find("guest")
|
||||||
|
assert guest is not None
|
||||||
|
assert guest.get("firstName") == "John"
|
||||||
|
assert guest.get("lastName") == "Doe"
|
||||||
|
assert guest.get("email") == "john@example.com"
|
||||||
|
|
||||||
|
room_reservations = reservation.find("roomReservations")
|
||||||
|
assert room_reservations is not None
|
||||||
|
rooms = room_reservations.findall("roomReservation")
|
||||||
|
assert len(rooms) == 1
|
||||||
|
|
||||||
|
def test_reservation_with_multiple_rooms(self):
|
||||||
|
"""Test reservation with multiple rooms."""
|
||||||
|
builder = ReservationXMLBuilder(
|
||||||
|
hotel_id="39054_001",
|
||||||
|
reservation_id="12345",
|
||||||
|
reservation_number="RES-001",
|
||||||
|
reservation_date="2025-11-14",
|
||||||
|
)
|
||||||
|
builder.set_guest(
|
||||||
|
guest_id="guest_001",
|
||||||
|
first_name="John",
|
||||||
|
last_name="Doe",
|
||||||
|
email="john@example.com",
|
||||||
|
)
|
||||||
|
builder.add_room(
|
||||||
|
arrival="2025-12-01",
|
||||||
|
departure="2025-12-05",
|
||||||
|
room_number="101",
|
||||||
|
revenue_logis_per_day=150.0,
|
||||||
|
)
|
||||||
|
builder.add_room(
|
||||||
|
arrival="2025-12-01",
|
||||||
|
departure="2025-12-05",
|
||||||
|
room_number="102",
|
||||||
|
revenue_logis_per_day=200.0,
|
||||||
|
)
|
||||||
|
|
||||||
|
xml_string = builder.build_xml()
|
||||||
|
root = ET.fromstring(xml_string)
|
||||||
|
|
||||||
|
reservation = root.find("reservation")
|
||||||
|
room_reservations = reservation.find("roomReservations")
|
||||||
|
rooms = room_reservations.findall("roomReservation")
|
||||||
|
|
||||||
|
assert len(rooms) == 2
|
||||||
|
assert rooms[0].get("roomNumber") == "101"
|
||||||
|
assert rooms[1].get("roomNumber") == "102"
|
||||||
|
|
||||||
|
def test_reservation_with_advertising_data(self):
|
||||||
|
"""Test reservation with advertising campaign data."""
|
||||||
|
builder = ReservationXMLBuilder(
|
||||||
|
hotel_id="39054_001",
|
||||||
|
reservation_id="12345",
|
||||||
|
reservation_number="RES-001",
|
||||||
|
reservation_date="2025-11-14",
|
||||||
|
advertising_medium="99TALES",
|
||||||
|
advertising_partner="google",
|
||||||
|
advertising_campagne="EAIaIQobChMI...",
|
||||||
|
)
|
||||||
|
builder.set_guest(
|
||||||
|
guest_id="guest_001",
|
||||||
|
first_name="John",
|
||||||
|
last_name="Doe",
|
||||||
|
email="john@example.com",
|
||||||
|
)
|
||||||
|
builder.add_room(
|
||||||
|
arrival="2025-12-01",
|
||||||
|
departure="2025-12-05",
|
||||||
|
)
|
||||||
|
|
||||||
|
xml_string = builder.build_xml()
|
||||||
|
root = ET.fromstring(xml_string)
|
||||||
|
|
||||||
|
reservation = root.find("reservation")
|
||||||
|
assert reservation.get("advertisingMedium") == "99TALES"
|
||||||
|
assert reservation.get("advertisingPartner") == "google"
|
||||||
|
assert reservation.get("advertisingCampagne") == "EAIaIQobChMI..."
|
||||||
|
|
||||||
|
|
||||||
|
class TestMultiReservationXMLBuilder:
|
||||||
|
"""Test MultiReservationXMLBuilder functionality."""
|
||||||
|
|
||||||
|
def test_multiple_reservations(self):
|
||||||
|
"""Test creating XML with multiple reservations."""
|
||||||
|
multi_builder = MultiReservationXMLBuilder()
|
||||||
|
|
||||||
|
# Add first reservation
|
||||||
|
res1 = ReservationXMLBuilder(
|
||||||
|
hotel_id="39054_001",
|
||||||
|
reservation_id="12345",
|
||||||
|
reservation_number="RES-001",
|
||||||
|
reservation_date="2025-11-14",
|
||||||
|
)
|
||||||
|
res1.set_guest(
|
||||||
|
guest_id="guest_001",
|
||||||
|
first_name="John",
|
||||||
|
last_name="Doe",
|
||||||
|
email="john@example.com",
|
||||||
|
)
|
||||||
|
res1.add_room(
|
||||||
|
arrival="2025-12-01",
|
||||||
|
departure="2025-12-03",
|
||||||
|
revenue_logis_per_day=150.0,
|
||||||
|
)
|
||||||
|
multi_builder.add_reservation(res1)
|
||||||
|
|
||||||
|
# Add second reservation
|
||||||
|
res2 = ReservationXMLBuilder(
|
||||||
|
hotel_id="39054_001",
|
||||||
|
reservation_id="12346",
|
||||||
|
reservation_number="RES-002",
|
||||||
|
reservation_date="2025-11-15",
|
||||||
|
)
|
||||||
|
res2.set_guest(
|
||||||
|
guest_id="guest_002",
|
||||||
|
first_name="Jane",
|
||||||
|
last_name="Smith",
|
||||||
|
email="jane@example.com",
|
||||||
|
)
|
||||||
|
res2.add_room(
|
||||||
|
arrival="2025-12-10",
|
||||||
|
departure="2025-12-12",
|
||||||
|
revenue_logis_per_day=200.0,
|
||||||
|
)
|
||||||
|
multi_builder.add_reservation(res2)
|
||||||
|
|
||||||
|
xml_string = multi_builder.build_xml()
|
||||||
|
root = ET.fromstring(xml_string)
|
||||||
|
|
||||||
|
assert root.tag == "reservations"
|
||||||
|
reservations = root.findall("reservation")
|
||||||
|
assert len(reservations) == 2
|
||||||
|
assert reservations[0].get("id") == "12345"
|
||||||
|
assert reservations[1].get("id") == "12346"
|
||||||
|
|
||||||
|
|
||||||
|
class TestConvenienceFeatures:
|
||||||
|
"""Test convenience features for common test scenarios."""
|
||||||
|
|
||||||
|
def test_simple_one_liner_reservation(self):
|
||||||
|
"""Test creating a simple reservation in a fluent style."""
|
||||||
|
xml = (
|
||||||
|
ReservationXMLBuilder(
|
||||||
|
hotel_id="39054_001",
|
||||||
|
reservation_id="12345",
|
||||||
|
reservation_number="RES-001",
|
||||||
|
reservation_date="2025-11-14",
|
||||||
|
)
|
||||||
|
.set_guest(
|
||||||
|
guest_id="guest_001",
|
||||||
|
first_name="John",
|
||||||
|
last_name="Doe",
|
||||||
|
email="john@example.com",
|
||||||
|
)
|
||||||
|
.add_room(
|
||||||
|
arrival="2025-12-01",
|
||||||
|
departure="2025-12-05",
|
||||||
|
revenue_logis_per_day=160.0,
|
||||||
|
)
|
||||||
|
.build_xml()
|
||||||
|
)
|
||||||
|
|
||||||
|
assert '<?xml version="1.0" ?>' in xml
|
||||||
|
assert 'hotelID="39054_001"' in xml
|
||||||
|
assert 'revenueLogis="160.0"' in xml
|
||||||
|
|
||||||
|
def test_revenue_calculation_for_multi_day_stay(self):
|
||||||
|
"""Test that daily sales are correctly generated for multi-day stays."""
|
||||||
|
builder = ReservationXMLBuilder(
|
||||||
|
hotel_id="39054_001",
|
||||||
|
reservation_id="12345",
|
||||||
|
reservation_number="RES-001",
|
||||||
|
reservation_date="2025-11-14",
|
||||||
|
)
|
||||||
|
builder.set_guest(
|
||||||
|
guest_id="guest_001",
|
||||||
|
first_name="John",
|
||||||
|
last_name="Doe",
|
||||||
|
email="john@example.com",
|
||||||
|
)
|
||||||
|
# 7-day stay (June 25 - July 2, 7 nights)
|
||||||
|
builder.add_room(
|
||||||
|
arrival="2026-06-25",
|
||||||
|
departure="2026-07-02",
|
||||||
|
revenue_logis_per_day=160.0,
|
||||||
|
)
|
||||||
|
|
||||||
|
elem = builder.build()
|
||||||
|
room_reservations = elem.find("roomReservations")
|
||||||
|
room = room_reservations.find("roomReservation")
|
||||||
|
daily_sales = room.find("dailySales")
|
||||||
|
daily_sale_elements = daily_sales.findall("dailySale")
|
||||||
|
|
||||||
|
# Should have 8 daily sale entries (7 nights + departure day)
|
||||||
|
assert len(daily_sale_elements) == 8
|
||||||
|
|
||||||
|
# First 7 should have revenue
|
||||||
|
for i in range(7):
|
||||||
|
assert daily_sale_elements[i].get("revenueLogis") == "160.0"
|
||||||
|
|
||||||
|
# Departure day should not have revenue
|
||||||
|
assert daily_sale_elements[7].get("revenueLogis") is None
|
||||||
|
|
||||||
|
|
||||||
|
if __name__ == "__main__":
|
||||||
|
pytest.main([__file__, "-v"])
|
||||||
68
uv.lock
generated
68
uv.lock
generated
@@ -37,6 +37,7 @@ dependencies = [
|
|||||||
{ name = "alembic" },
|
{ name = "alembic" },
|
||||||
{ name = "annotatedyaml" },
|
{ name = "annotatedyaml" },
|
||||||
{ name = "asyncpg" },
|
{ name = "asyncpg" },
|
||||||
|
{ name = "bcrypt" },
|
||||||
{ name = "dotenv" },
|
{ name = "dotenv" },
|
||||||
{ name = "fast-langdetect" },
|
{ name = "fast-langdetect" },
|
||||||
{ name = "fastapi" },
|
{ name = "fastapi" },
|
||||||
@@ -69,6 +70,7 @@ requires-dist = [
|
|||||||
{ name = "alembic", specifier = ">=1.17.2" },
|
{ name = "alembic", specifier = ">=1.17.2" },
|
||||||
{ name = "annotatedyaml", specifier = ">=1.0.0" },
|
{ name = "annotatedyaml", specifier = ">=1.0.0" },
|
||||||
{ name = "asyncpg", specifier = ">=0.30.0" },
|
{ name = "asyncpg", specifier = ">=0.30.0" },
|
||||||
|
{ name = "bcrypt", specifier = ">=5.0.0" },
|
||||||
{ name = "dotenv", specifier = ">=0.9.9" },
|
{ name = "dotenv", specifier = ">=0.9.9" },
|
||||||
{ name = "fast-langdetect", specifier = ">=1.0.0" },
|
{ name = "fast-langdetect", specifier = ">=1.0.0" },
|
||||||
{ name = "fastapi", specifier = ">=0.117.1" },
|
{ name = "fastapi", specifier = ">=0.117.1" },
|
||||||
@@ -171,6 +173,72 @@ wheels = [
|
|||||||
{ url = "https://files.pythonhosted.org/packages/c8/a4/cec76b3389c4c5ff66301cd100fe88c318563ec8a520e0b2e792b5b84972/asyncpg-0.30.0-cp313-cp313-win_amd64.whl", hash = "sha256:f59b430b8e27557c3fb9869222559f7417ced18688375825f8f12302c34e915e", size = 621623, upload-time = "2024-10-20T00:30:09.024Z" },
|
{ url = "https://files.pythonhosted.org/packages/c8/a4/cec76b3389c4c5ff66301cd100fe88c318563ec8a520e0b2e792b5b84972/asyncpg-0.30.0-cp313-cp313-win_amd64.whl", hash = "sha256:f59b430b8e27557c3fb9869222559f7417ced18688375825f8f12302c34e915e", size = 621623, upload-time = "2024-10-20T00:30:09.024Z" },
|
||||||
]
|
]
|
||||||
|
|
||||||
|
[[package]]
|
||||||
|
name = "bcrypt"
|
||||||
|
version = "5.0.0"
|
||||||
|
source = { registry = "https://pypi.org/simple" }
|
||||||
|
sdist = { url = "https://files.pythonhosted.org/packages/d4/36/3329e2518d70ad8e2e5817d5a4cac6bba05a47767ec416c7d020a965f408/bcrypt-5.0.0.tar.gz", hash = "sha256:f748f7c2d6fd375cc93d3fba7ef4a9e3a092421b8dbf34d8d4dc06be9492dfdd", size = 25386, upload-time = "2025-09-25T19:50:47.829Z" }
|
||||||
|
wheels = [
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/13/85/3e65e01985fddf25b64ca67275bb5bdb4040bd1a53b66d355c6c37c8a680/bcrypt-5.0.0-cp313-cp313t-macosx_10_12_universal2.whl", hash = "sha256:f3c08197f3039bec79cee59a606d62b96b16669cff3949f21e74796b6e3cd2be", size = 481806, upload-time = "2025-09-25T19:49:05.102Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/44/dc/01eb79f12b177017a726cbf78330eb0eb442fae0e7b3dfd84ea2849552f3/bcrypt-5.0.0-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:200af71bc25f22006f4069060c88ed36f8aa4ff7f53e67ff04d2ab3f1e79a5b2", size = 268626, upload-time = "2025-09-25T19:49:06.723Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/8c/cf/e82388ad5959c40d6afd94fb4743cc077129d45b952d46bdc3180310e2df/bcrypt-5.0.0-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:baade0a5657654c2984468efb7d6c110db87ea63ef5a4b54732e7e337253e44f", size = 271853, upload-time = "2025-09-25T19:49:08.028Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/ec/86/7134b9dae7cf0efa85671651341f6afa695857fae172615e960fb6a466fa/bcrypt-5.0.0-cp313-cp313t-manylinux_2_28_aarch64.whl", hash = "sha256:c58b56cdfb03202b3bcc9fd8daee8e8e9b6d7e3163aa97c631dfcfcc24d36c86", size = 269793, upload-time = "2025-09-25T19:49:09.727Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/cc/82/6296688ac1b9e503d034e7d0614d56e80c5d1a08402ff856a4549cb59207/bcrypt-5.0.0-cp313-cp313t-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:4bfd2a34de661f34d0bda43c3e4e79df586e4716ef401fe31ea39d69d581ef23", size = 289930, upload-time = "2025-09-25T19:49:11.204Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/d1/18/884a44aa47f2a3b88dd09bc05a1e40b57878ecd111d17e5bba6f09f8bb77/bcrypt-5.0.0-cp313-cp313t-manylinux_2_28_x86_64.whl", hash = "sha256:ed2e1365e31fc73f1825fa830f1c8f8917ca1b3ca6185773b349c20fd606cec2", size = 272194, upload-time = "2025-09-25T19:49:12.524Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/0e/8f/371a3ab33c6982070b674f1788e05b656cfbf5685894acbfef0c65483a59/bcrypt-5.0.0-cp313-cp313t-manylinux_2_34_aarch64.whl", hash = "sha256:83e787d7a84dbbfba6f250dd7a5efd689e935f03dd83b0f919d39349e1f23f83", size = 269381, upload-time = "2025-09-25T19:49:14.308Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/b1/34/7e4e6abb7a8778db6422e88b1f06eb07c47682313997ee8a8f9352e5a6f1/bcrypt-5.0.0-cp313-cp313t-manylinux_2_34_x86_64.whl", hash = "sha256:137c5156524328a24b9fac1cb5db0ba618bc97d11970b39184c1d87dc4bf1746", size = 271750, upload-time = "2025-09-25T19:49:15.584Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/c0/1b/54f416be2499bd72123c70d98d36c6cd61a4e33d9b89562c22481c81bb30/bcrypt-5.0.0-cp313-cp313t-musllinux_1_1_aarch64.whl", hash = "sha256:38cac74101777a6a7d3b3e3cfefa57089b5ada650dce2baf0cbdd9d65db22a9e", size = 303757, upload-time = "2025-09-25T19:49:17.244Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/13/62/062c24c7bcf9d2826a1a843d0d605c65a755bc98002923d01fd61270705a/bcrypt-5.0.0-cp313-cp313t-musllinux_1_1_x86_64.whl", hash = "sha256:d8d65b564ec849643d9f7ea05c6d9f0cd7ca23bdd4ac0c2dbef1104ab504543d", size = 306740, upload-time = "2025-09-25T19:49:18.693Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/d5/c8/1fdbfc8c0f20875b6b4020f3c7dc447b8de60aa0be5faaf009d24242aec9/bcrypt-5.0.0-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:741449132f64b3524e95cd30e5cd3343006ce146088f074f31ab26b94e6c75ba", size = 334197, upload-time = "2025-09-25T19:49:20.523Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/a6/c1/8b84545382d75bef226fbc6588af0f7b7d095f7cd6a670b42a86243183cd/bcrypt-5.0.0-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:212139484ab3207b1f0c00633d3be92fef3c5f0af17cad155679d03ff2ee1e41", size = 352974, upload-time = "2025-09-25T19:49:22.254Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/10/a6/ffb49d4254ed085e62e3e5dd05982b4393e32fe1e49bb1130186617c29cd/bcrypt-5.0.0-cp313-cp313t-win32.whl", hash = "sha256:9d52ed507c2488eddd6a95bccee4e808d3234fa78dd370e24bac65a21212b861", size = 148498, upload-time = "2025-09-25T19:49:24.134Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/48/a9/259559edc85258b6d5fc5471a62a3299a6aa37a6611a169756bf4689323c/bcrypt-5.0.0-cp313-cp313t-win_amd64.whl", hash = "sha256:f6984a24db30548fd39a44360532898c33528b74aedf81c26cf29c51ee47057e", size = 145853, upload-time = "2025-09-25T19:49:25.702Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/2d/df/9714173403c7e8b245acf8e4be8876aac64a209d1b392af457c79e60492e/bcrypt-5.0.0-cp313-cp313t-win_arm64.whl", hash = "sha256:9fffdb387abe6aa775af36ef16f55e318dcda4194ddbf82007a6f21da29de8f5", size = 139626, upload-time = "2025-09-25T19:49:26.928Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/f8/14/c18006f91816606a4abe294ccc5d1e6f0e42304df5a33710e9e8e95416e1/bcrypt-5.0.0-cp314-cp314t-macosx_10_12_universal2.whl", hash = "sha256:4870a52610537037adb382444fefd3706d96d663ac44cbb2f37e3919dca3d7ef", size = 481862, upload-time = "2025-09-25T19:49:28.365Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/67/49/dd074d831f00e589537e07a0725cf0e220d1f0d5d8e85ad5bbff251c45aa/bcrypt-5.0.0-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:48f753100931605686f74e27a7b49238122aa761a9aefe9373265b8b7aa43ea4", size = 268544, upload-time = "2025-09-25T19:49:30.39Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/f5/91/50ccba088b8c474545b034a1424d05195d9fcbaaf802ab8bfe2be5a4e0d7/bcrypt-5.0.0-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:f70aadb7a809305226daedf75d90379c397b094755a710d7014b8b117df1ebbf", size = 271787, upload-time = "2025-09-25T19:49:32.144Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/aa/e7/d7dba133e02abcda3b52087a7eea8c0d4f64d3e593b4fffc10c31b7061f3/bcrypt-5.0.0-cp314-cp314t-manylinux_2_28_aarch64.whl", hash = "sha256:744d3c6b164caa658adcb72cb8cc9ad9b4b75c7db507ab4bc2480474a51989da", size = 269753, upload-time = "2025-09-25T19:49:33.885Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/33/fc/5b145673c4b8d01018307b5c2c1fc87a6f5a436f0ad56607aee389de8ee3/bcrypt-5.0.0-cp314-cp314t-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:a28bc05039bdf3289d757f49d616ab3efe8cf40d8e8001ccdd621cd4f98f4fc9", size = 289587, upload-time = "2025-09-25T19:49:35.144Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/27/d7/1ff22703ec6d4f90e62f1a5654b8867ef96bafb8e8102c2288333e1a6ca6/bcrypt-5.0.0-cp314-cp314t-manylinux_2_28_x86_64.whl", hash = "sha256:7f277a4b3390ab4bebe597800a90da0edae882c6196d3038a73adf446c4f969f", size = 272178, upload-time = "2025-09-25T19:49:36.793Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/c8/88/815b6d558a1e4d40ece04a2f84865b0fef233513bd85fd0e40c294272d62/bcrypt-5.0.0-cp314-cp314t-manylinux_2_34_aarch64.whl", hash = "sha256:79cfa161eda8d2ddf29acad370356b47f02387153b11d46042e93a0a95127493", size = 269295, upload-time = "2025-09-25T19:49:38.164Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/51/8c/e0db387c79ab4931fc89827d37608c31cc57b6edc08ccd2386139028dc0d/bcrypt-5.0.0-cp314-cp314t-manylinux_2_34_x86_64.whl", hash = "sha256:a5393eae5722bcef046a990b84dff02b954904c36a194f6cfc817d7dca6c6f0b", size = 271700, upload-time = "2025-09-25T19:49:39.917Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/06/83/1570edddd150f572dbe9fc00f6203a89fc7d4226821f67328a85c330f239/bcrypt-5.0.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:7f4c94dec1b5ab5d522750cb059bb9409ea8872d4494fd152b53cca99f1ddd8c", size = 334034, upload-time = "2025-09-25T19:49:41.227Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/c9/f2/ea64e51a65e56ae7a8a4ec236c2bfbdd4b23008abd50ac33fbb2d1d15424/bcrypt-5.0.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:0cae4cb350934dfd74c020525eeae0a5f79257e8a201c0c176f4b84fdbf2a4b4", size = 352766, upload-time = "2025-09-25T19:49:43.08Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/d7/d4/1a388d21ee66876f27d1a1f41287897d0c0f1712ef97d395d708ba93004c/bcrypt-5.0.0-cp314-cp314t-win32.whl", hash = "sha256:b17366316c654e1ad0306a6858e189fc835eca39f7eb2cafd6aaca8ce0c40a2e", size = 152449, upload-time = "2025-09-25T19:49:44.971Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/3f/61/3291c2243ae0229e5bca5d19f4032cecad5dfb05a2557169d3a69dc0ba91/bcrypt-5.0.0-cp314-cp314t-win_amd64.whl", hash = "sha256:92864f54fb48b4c718fc92a32825d0e42265a627f956bc0361fe869f1adc3e7d", size = 149310, upload-time = "2025-09-25T19:49:46.162Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/3e/89/4b01c52ae0c1a681d4021e5dd3e45b111a8fb47254a274fa9a378d8d834b/bcrypt-5.0.0-cp314-cp314t-win_arm64.whl", hash = "sha256:dd19cf5184a90c873009244586396a6a884d591a5323f0e8a5922560718d4993", size = 143761, upload-time = "2025-09-25T19:49:47.345Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/84/29/6237f151fbfe295fe3e074ecc6d44228faa1e842a81f6d34a02937ee1736/bcrypt-5.0.0-cp38-abi3-macosx_10_12_universal2.whl", hash = "sha256:fc746432b951e92b58317af8e0ca746efe93e66555f1b40888865ef5bf56446b", size = 494553, upload-time = "2025-09-25T19:49:49.006Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/45/b6/4c1205dde5e464ea3bd88e8742e19f899c16fa8916fb8510a851fae985b5/bcrypt-5.0.0-cp38-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:c2388ca94ffee269b6038d48747f4ce8df0ffbea43f31abfa18ac72f0218effb", size = 275009, upload-time = "2025-09-25T19:49:50.581Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/3b/71/427945e6ead72ccffe77894b2655b695ccf14ae1866cd977e185d606dd2f/bcrypt-5.0.0-cp38-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:560ddb6ec730386e7b3b26b8b4c88197aaed924430e7b74666a586ac997249ef", size = 278029, upload-time = "2025-09-25T19:49:52.533Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/17/72/c344825e3b83c5389a369c8a8e58ffe1480b8a699f46c127c34580c4666b/bcrypt-5.0.0-cp38-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:d79e5c65dcc9af213594d6f7f1fa2c98ad3fc10431e7aa53c176b441943efbdd", size = 275907, upload-time = "2025-09-25T19:49:54.709Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/0b/7e/d4e47d2df1641a36d1212e5c0514f5291e1a956a7749f1e595c07a972038/bcrypt-5.0.0-cp38-abi3-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:2b732e7d388fa22d48920baa267ba5d97cca38070b69c0e2d37087b381c681fd", size = 296500, upload-time = "2025-09-25T19:49:56.013Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/0f/c3/0ae57a68be2039287ec28bc463b82e4b8dc23f9d12c0be331f4782e19108/bcrypt-5.0.0-cp38-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:0c8e093ea2532601a6f686edbc2c6b2ec24131ff5c52f7610dd64fa4553b5464", size = 278412, upload-time = "2025-09-25T19:49:57.356Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/45/2b/77424511adb11e6a99e3a00dcc7745034bee89036ad7d7e255a7e47be7d8/bcrypt-5.0.0-cp38-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:5b1589f4839a0899c146e8892efe320c0fa096568abd9b95593efac50a87cb75", size = 275486, upload-time = "2025-09-25T19:49:59.116Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/43/0a/405c753f6158e0f3f14b00b462d8bca31296f7ecfc8fc8bc7919c0c7d73a/bcrypt-5.0.0-cp38-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:89042e61b5e808b67daf24a434d89bab164d4de1746b37a8d173b6b14f3db9ff", size = 277940, upload-time = "2025-09-25T19:50:00.869Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/62/83/b3efc285d4aadc1fa83db385ec64dcfa1707e890eb42f03b127d66ac1b7b/bcrypt-5.0.0-cp38-abi3-musllinux_1_1_aarch64.whl", hash = "sha256:e3cf5b2560c7b5a142286f69bde914494b6d8f901aaa71e453078388a50881c4", size = 310776, upload-time = "2025-09-25T19:50:02.393Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/95/7d/47ee337dacecde6d234890fe929936cb03ebc4c3a7460854bbd9c97780b8/bcrypt-5.0.0-cp38-abi3-musllinux_1_1_x86_64.whl", hash = "sha256:f632fd56fc4e61564f78b46a2269153122db34988e78b6be8b32d28507b7eaeb", size = 312922, upload-time = "2025-09-25T19:50:04.232Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/d6/3a/43d494dfb728f55f4e1cf8fd435d50c16a2d75493225b54c8d06122523c6/bcrypt-5.0.0-cp38-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:801cad5ccb6b87d1b430f183269b94c24f248dddbbc5c1f78b6ed231743e001c", size = 341367, upload-time = "2025-09-25T19:50:05.559Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/55/ab/a0727a4547e383e2e22a630e0f908113db37904f58719dc48d4622139b5c/bcrypt-5.0.0-cp38-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:3cf67a804fc66fc217e6914a5635000259fbbbb12e78a99488e4d5ba445a71eb", size = 359187, upload-time = "2025-09-25T19:50:06.916Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/1b/bb/461f352fdca663524b4643d8b09e8435b4990f17fbf4fea6bc2a90aa0cc7/bcrypt-5.0.0-cp38-abi3-win32.whl", hash = "sha256:3abeb543874b2c0524ff40c57a4e14e5d3a66ff33fb423529c88f180fd756538", size = 153752, upload-time = "2025-09-25T19:50:08.515Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/41/aa/4190e60921927b7056820291f56fc57d00d04757c8b316b2d3c0d1d6da2c/bcrypt-5.0.0-cp38-abi3-win_amd64.whl", hash = "sha256:35a77ec55b541e5e583eb3436ffbbf53b0ffa1fa16ca6782279daf95d146dcd9", size = 150881, upload-time = "2025-09-25T19:50:09.742Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/54/12/cd77221719d0b39ac0b55dbd39358db1cd1246e0282e104366ebbfb8266a/bcrypt-5.0.0-cp38-abi3-win_arm64.whl", hash = "sha256:cde08734f12c6a4e28dc6755cd11d3bdfea608d93d958fffbe95a7026ebe4980", size = 144931, upload-time = "2025-09-25T19:50:11.016Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/5d/ba/2af136406e1c3839aea9ecadc2f6be2bcd1eff255bd451dd39bcf302c47a/bcrypt-5.0.0-cp39-abi3-macosx_10_12_universal2.whl", hash = "sha256:0c418ca99fd47e9c59a301744d63328f17798b5947b0f791e9af3c1c499c2d0a", size = 495313, upload-time = "2025-09-25T19:50:12.309Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/ac/ee/2f4985dbad090ace5ad1f7dd8ff94477fe089b5fab2040bd784a3d5f187b/bcrypt-5.0.0-cp39-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:ddb4e1500f6efdd402218ffe34d040a1196c072e07929b9820f363a1fd1f4191", size = 275290, upload-time = "2025-09-25T19:50:13.673Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/e4/6e/b77ade812672d15cf50842e167eead80ac3514f3beacac8902915417f8b7/bcrypt-5.0.0-cp39-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:7aeef54b60ceddb6f30ee3db090351ecf0d40ec6e2abf41430997407a46d2254", size = 278253, upload-time = "2025-09-25T19:50:15.089Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/36/c4/ed00ed32f1040f7990dac7115f82273e3c03da1e1a1587a778d8cea496d8/bcrypt-5.0.0-cp39-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:f0ce778135f60799d89c9693b9b398819d15f1921ba15fe719acb3178215a7db", size = 276084, upload-time = "2025-09-25T19:50:16.699Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/e7/c4/fa6e16145e145e87f1fa351bbd54b429354fd72145cd3d4e0c5157cf4c70/bcrypt-5.0.0-cp39-abi3-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:a71f70ee269671460b37a449f5ff26982a6f2ba493b3eabdd687b4bf35f875ac", size = 297185, upload-time = "2025-09-25T19:50:18.525Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/24/b4/11f8a31d8b67cca3371e046db49baa7c0594d71eb40ac8121e2fc0888db0/bcrypt-5.0.0-cp39-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:f8429e1c410b4073944f03bd778a9e066e7fad723564a52ff91841d278dfc822", size = 278656, upload-time = "2025-09-25T19:50:19.809Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/ac/31/79f11865f8078e192847d2cb526e3fa27c200933c982c5b2869720fa5fce/bcrypt-5.0.0-cp39-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:edfcdcedd0d0f05850c52ba3127b1fce70b9f89e0fe5ff16517df7e81fa3cbb8", size = 275662, upload-time = "2025-09-25T19:50:21.567Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/d4/8d/5e43d9584b3b3591a6f9b68f755a4da879a59712981ef5ad2a0ac1379f7a/bcrypt-5.0.0-cp39-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:611f0a17aa4a25a69362dcc299fda5c8a3d4f160e2abb3831041feb77393a14a", size = 278240, upload-time = "2025-09-25T19:50:23.305Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/89/48/44590e3fc158620f680a978aafe8f87a4c4320da81ed11552f0323aa9a57/bcrypt-5.0.0-cp39-abi3-musllinux_1_1_aarch64.whl", hash = "sha256:db99dca3b1fdc3db87d7c57eac0c82281242d1eabf19dcb8a6b10eb29a2e72d1", size = 311152, upload-time = "2025-09-25T19:50:24.597Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/5f/85/e4fbfc46f14f47b0d20493669a625da5827d07e8a88ee460af6cd9768b44/bcrypt-5.0.0-cp39-abi3-musllinux_1_1_x86_64.whl", hash = "sha256:5feebf85a9cefda32966d8171f5db7e3ba964b77fdfe31919622256f80f9cf42", size = 313284, upload-time = "2025-09-25T19:50:26.268Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/25/ae/479f81d3f4594456a01ea2f05b132a519eff9ab5768a70430fa1132384b1/bcrypt-5.0.0-cp39-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:3ca8a166b1140436e058298a34d88032ab62f15aae1c598580333dc21d27ef10", size = 341643, upload-time = "2025-09-25T19:50:28.02Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/df/d2/36a086dee1473b14276cd6ea7f61aef3b2648710b5d7f1c9e032c29b859f/bcrypt-5.0.0-cp39-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:61afc381250c3182d9078551e3ac3a41da14154fbff647ddf52a769f588c4172", size = 359698, upload-time = "2025-09-25T19:50:31.347Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/c0/f6/688d2cd64bfd0b14d805ddb8a565e11ca1fb0fd6817175d58b10052b6d88/bcrypt-5.0.0-cp39-abi3-win32.whl", hash = "sha256:64d7ce196203e468c457c37ec22390f1a61c85c6f0b8160fd752940ccfb3a683", size = 153725, upload-time = "2025-09-25T19:50:34.384Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/9f/b9/9d9a641194a730bda138b3dfe53f584d61c58cd5230e37566e83ec2ffa0d/bcrypt-5.0.0-cp39-abi3-win_amd64.whl", hash = "sha256:64ee8434b0da054d830fa8e89e1c8bf30061d539044a39524ff7dec90481e5c2", size = 150912, upload-time = "2025-09-25T19:50:35.69Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/27/44/d2ef5e87509158ad2187f4dd0852df80695bb1ee0cfe0a684727b01a69e0/bcrypt-5.0.0-cp39-abi3-win_arm64.whl", hash = "sha256:f2347d3534e76bf50bca5500989d6c1d05ed64b440408057a37673282c654927", size = 144953, upload-time = "2025-09-25T19:50:37.32Z" },
|
||||||
|
]
|
||||||
|
|
||||||
[[package]]
|
[[package]]
|
||||||
name = "certifi"
|
name = "certifi"
|
||||||
version = "2025.8.3"
|
version = "2025.8.3"
|
||||||
|
|||||||
Reference in New Issue
Block a user