Compare commits
2 Commits
1fb698de66
...
b77a0be80f
| Author | SHA1 | Date | |
|---|---|---|---|
|
|
b77a0be80f | ||
|
|
c339a9cace |
403
WEBHOOK_REFACTORING_SUMMARY.md
Normal file
403
WEBHOOK_REFACTORING_SUMMARY.md
Normal file
@@ -0,0 +1,403 @@
|
|||||||
|
# Webhook System Refactoring - Implementation Summary
|
||||||
|
|
||||||
|
## Overview
|
||||||
|
This document summarizes the webhook system refactoring that was implemented to solve race conditions, unify webhook handling, add security through randomized URLs, and migrate hotel configuration to the database.
|
||||||
|
|
||||||
|
## What Was Implemented
|
||||||
|
|
||||||
|
### 1. Database Models ✅
|
||||||
|
**File:** [src/alpine_bits_python/db.py](src/alpine_bits_python/db.py)
|
||||||
|
|
||||||
|
Added three new database models:
|
||||||
|
|
||||||
|
#### Hotel Model
|
||||||
|
- Stores hotel configuration (previously in `alpine_bits_auth` config.yaml section)
|
||||||
|
- Fields: hotel_id, hotel_name, username, password_hash (bcrypt), meta/google account IDs, push endpoint config
|
||||||
|
- Relationships: one-to-many with webhook_endpoints
|
||||||
|
|
||||||
|
#### WebhookEndpoint Model
|
||||||
|
- Stores webhook configurations per hotel
|
||||||
|
- Each hotel can have multiple webhook types (wix_form, generic, etc.)
|
||||||
|
- Each endpoint has a unique randomized webhook_secret (64-char URL-safe string)
|
||||||
|
- Fields: webhook_secret, webhook_type, hotel_id, description, is_enabled
|
||||||
|
|
||||||
|
#### WebhookRequest Model
|
||||||
|
- Tracks incoming webhooks for deduplication and retry handling
|
||||||
|
- Uses SHA256 payload hashing to detect duplicates
|
||||||
|
- Status tracking: pending → processing → completed/failed
|
||||||
|
- Supports payload purging after retention period
|
||||||
|
- Fields: payload_hash, status, payload_json, retry_count, created_at, processing timestamps
|
||||||
|
|
||||||
|
### 2. Alembic Migration ✅
|
||||||
|
**File:** [alembic/versions/2025_11_25_1155-e7ee03d8f430_add_hotels_and_webhook_tables.py](alembic/versions/2025_11_25_1155-e7ee03d8f430_add_hotels_and_webhook_tables.py)
|
||||||
|
|
||||||
|
- Creates all three tables with appropriate indexes
|
||||||
|
- Includes composite indexes for query performance
|
||||||
|
- Fully reversible (downgrade supported)
|
||||||
|
|
||||||
|
### 3. Hotel Service ✅
|
||||||
|
**File:** [src/alpine_bits_python/hotel_service.py](src/alpine_bits_python/hotel_service.py)
|
||||||
|
|
||||||
|
**Key Functions:**
|
||||||
|
- `hash_password()` - Bcrypt password hashing (12 rounds)
|
||||||
|
- `verify_password()` - Bcrypt password verification
|
||||||
|
- `generate_webhook_secret()` - Cryptographically secure secret generation
|
||||||
|
- `sync_config_to_database()` - Syncs config.yaml to database at startup
|
||||||
|
- Creates/updates hotels from alpine_bits_auth config
|
||||||
|
- Auto-generates default webhook endpoints if missing
|
||||||
|
- Idempotent - safe to run on every startup
|
||||||
|
|
||||||
|
**HotelService Class:**
|
||||||
|
- `get_hotel_by_id()` - Look up hotel by hotel_id
|
||||||
|
- `get_hotel_by_webhook_secret()` - Look up hotel and endpoint by webhook secret
|
||||||
|
- `get_hotel_by_username()` - Look up hotel by AlpineBits username
|
||||||
|
|
||||||
|
### 4. Webhook Processor Interface ✅
|
||||||
|
**File:** [src/alpine_bits_python/webhook_processor.py](src/alpine_bits_python/webhook_processor.py)
|
||||||
|
|
||||||
|
**Architecture:**
|
||||||
|
- Protocol-based interface for webhook processors
|
||||||
|
- Registry pattern for managing processor types
|
||||||
|
- Two built-in processors:
|
||||||
|
- `WixFormProcessor` - Wraps existing `process_wix_form_submission()`
|
||||||
|
- `GenericWebhookProcessor` - Wraps existing `process_generic_webhook_submission()`
|
||||||
|
|
||||||
|
**Benefits:**
|
||||||
|
- Easy to add new webhook types
|
||||||
|
- Clean separation of concerns
|
||||||
|
- Type-safe processor interface
|
||||||
|
|
||||||
|
### 5. Config-to-Database Sync ✅
|
||||||
|
**File:** [src/alpine_bits_python/db_setup.py](src/alpine_bits_python/db_setup.py)
|
||||||
|
|
||||||
|
- Added call to `sync_config_to_database()` in `run_startup_tasks()`
|
||||||
|
- Runs on every application startup (primary worker only)
|
||||||
|
- Logs statistics about created/updated hotels and endpoints
|
||||||
|
|
||||||
|
### 6. Unified Webhook Handler ✅
|
||||||
|
**File:** [src/alpine_bits_python/api.py](src/alpine_bits_python/api.py)
|
||||||
|
|
||||||
|
**Endpoint:** `POST /api/webhook/{webhook_secret}`
|
||||||
|
|
||||||
|
**Flow:**
|
||||||
|
1. Look up webhook_endpoint by webhook_secret
|
||||||
|
2. Parse and hash payload (SHA256)
|
||||||
|
3. Check for duplicate using `SELECT FOR UPDATE SKIP LOCKED`
|
||||||
|
4. Return immediately if already processed (idempotent)
|
||||||
|
5. Create WebhookRequest with status='processing'
|
||||||
|
6. Route to appropriate processor based on webhook_type
|
||||||
|
7. Update status to 'completed' or 'failed'
|
||||||
|
8. Return response with webhook_id
|
||||||
|
|
||||||
|
**Race Condition Prevention:**
|
||||||
|
- PostgreSQL row-level locking with `SKIP LOCKED`
|
||||||
|
- Atomic status transitions
|
||||||
|
- Payload hash uniqueness constraint
|
||||||
|
- If duplicate detected during processing, return success (not error)
|
||||||
|
|
||||||
|
**Features:**
|
||||||
|
- Gzip decompression support
|
||||||
|
- Payload size limit (10MB)
|
||||||
|
- Automatic retry for failed webhooks
|
||||||
|
- Detailed error logging
|
||||||
|
- Source IP and user agent tracking
|
||||||
|
|
||||||
|
### 7. Cleanup and Monitoring ✅
|
||||||
|
**File:** [src/alpine_bits_python/api.py](src/alpine_bits_python/api.py)
|
||||||
|
|
||||||
|
**Functions:**
|
||||||
|
- `cleanup_stale_webhooks()` - Reset webhooks stuck in 'processing' (worker crash recovery)
|
||||||
|
- `purge_old_webhook_payloads()` - Remove payload_json from old completed webhooks (keeps metadata)
|
||||||
|
- `periodic_webhook_cleanup()` - Runs both cleanup tasks
|
||||||
|
|
||||||
|
**Scheduling:**
|
||||||
|
- Periodic task runs every 5 minutes (primary worker only)
|
||||||
|
- Stale timeout: 10 minutes
|
||||||
|
- Payload retention: 7 days before purge
|
||||||
|
|
||||||
|
### 8. Processor Initialization ✅
|
||||||
|
**File:** [src/alpine_bits_python/api.py](src/alpine_bits_python/api.py) - lifespan function
|
||||||
|
|
||||||
|
- Calls `initialize_webhook_processors()` during application startup
|
||||||
|
- Registers all built-in processors (wix_form, generic)
|
||||||
|
|
||||||
|
## What Was NOT Implemented (Future Work)
|
||||||
|
|
||||||
|
### 1. Legacy Endpoint Updates
|
||||||
|
The existing `/api/webhook/wix-form` and `/api/webhook/generic` endpoints still work as before. They could be updated to:
|
||||||
|
- Look up hotel from database
|
||||||
|
- Find appropriate webhook endpoint
|
||||||
|
- Redirect to unified handler
|
||||||
|
|
||||||
|
This is backward compatible, so it's not urgent.
|
||||||
|
|
||||||
|
### 2. AlpineBits Authentication Updates
|
||||||
|
The `validate_basic_auth()` function still reads from config.yaml. It could be updated to:
|
||||||
|
- Query hotels table by username
|
||||||
|
- Use bcrypt to verify password
|
||||||
|
- Return Hotel object instead of just credentials
|
||||||
|
|
||||||
|
This requires changing the AlpineBits auth flow, so it's a separate task.
|
||||||
|
|
||||||
|
### 3. Admin Endpoints
|
||||||
|
Could add endpoints for:
|
||||||
|
- `GET /admin/webhooks/stats` - Processing statistics
|
||||||
|
- `GET /admin/webhooks/failed` - Recent failures
|
||||||
|
- `POST /admin/webhooks/{id}/retry` - Manually retry failed webhook
|
||||||
|
- `GET /admin/hotels` - List all hotels with webhook URLs
|
||||||
|
- `POST /admin/hotels/{id}/webhook` - Create new webhook endpoint
|
||||||
|
|
||||||
|
### 4. Tests
|
||||||
|
Need to write tests for:
|
||||||
|
- Hotel service functions
|
||||||
|
- Webhook processors
|
||||||
|
- Unified webhook handler
|
||||||
|
- Race condition scenarios (concurrent identical webhooks)
|
||||||
|
- Deduplication logic
|
||||||
|
- Cleanup functions
|
||||||
|
|
||||||
|
## How to Use
|
||||||
|
|
||||||
|
### 1. Run Migration
|
||||||
|
```bash
|
||||||
|
uv run alembic upgrade head
|
||||||
|
```
|
||||||
|
|
||||||
|
### 2. Start Application
|
||||||
|
The application will automatically:
|
||||||
|
- Sync config.yaml hotels to database
|
||||||
|
- Generate default webhook endpoints for each hotel
|
||||||
|
- Log webhook URLs to console
|
||||||
|
- Start periodic cleanup tasks
|
||||||
|
|
||||||
|
### 3. Use New Webhook URLs
|
||||||
|
Each hotel will have webhook URLs like:
|
||||||
|
```
|
||||||
|
POST /api/webhook/{webhook_secret}
|
||||||
|
```
|
||||||
|
|
||||||
|
The webhook_secret is logged at startup, or you can query the database:
|
||||||
|
```sql
|
||||||
|
SELECT h.hotel_id, h.hotel_name, we.webhook_type, we.webhook_secret
|
||||||
|
FROM hotels h
|
||||||
|
JOIN webhook_endpoints we ON h.hotel_id = we.hotel_id
|
||||||
|
WHERE we.is_enabled = true;
|
||||||
|
```
|
||||||
|
|
||||||
|
Example webhook URL:
|
||||||
|
```
|
||||||
|
https://your-domain.com/api/webhook/x7K9mPq2rYv8sN4jZwL6tH1fBd3gCa5eFhIk0uMoQp-RnVxWy
|
||||||
|
```
|
||||||
|
|
||||||
|
### 4. Legacy Endpoints Still Work
|
||||||
|
Existing integrations using `/api/webhook/wix-form` or `/api/webhook/generic` will continue to work without changes.
|
||||||
|
|
||||||
|
## Benefits Achieved
|
||||||
|
|
||||||
|
### 1. Race Condition Prevention ✅
|
||||||
|
- PostgreSQL row-level locking prevents duplicate processing
|
||||||
|
- Atomic status transitions ensure only one worker processes each webhook
|
||||||
|
- Stale webhook cleanup recovers from worker crashes
|
||||||
|
|
||||||
|
### 2. Unified Webhook Handling ✅
|
||||||
|
- Single entry point with pluggable processor interface
|
||||||
|
- Easy to add new webhook types
|
||||||
|
- Consistent error handling and logging
|
||||||
|
|
||||||
|
### 3. Secure Webhook URLs ✅
|
||||||
|
- Randomized 64-character URL-safe secrets
|
||||||
|
- One unique secret per hotel/webhook-type combination
|
||||||
|
- No authentication needed (secret provides security)
|
||||||
|
|
||||||
|
### 4. Database-Backed Configuration ✅
|
||||||
|
- Hotel config automatically synced from config.yaml
|
||||||
|
- Passwords hashed with bcrypt
|
||||||
|
- Webhook endpoints stored in database
|
||||||
|
- Easy to manage via SQL queries
|
||||||
|
|
||||||
|
### 5. Payload Management ✅
|
||||||
|
- Automatic purging of old payloads (keeps metadata)
|
||||||
|
- Configurable retention period
|
||||||
|
- Efficient storage usage
|
||||||
|
|
||||||
|
### 6. Observability ✅
|
||||||
|
- Webhook requests tracked in database
|
||||||
|
- Status history maintained
|
||||||
|
- Source IP and user agent logged
|
||||||
|
- Retry count tracked
|
||||||
|
- Error messages stored
|
||||||
|
|
||||||
|
## Configuration
|
||||||
|
|
||||||
|
### Existing Config (config.yaml)
|
||||||
|
No changes required! The existing `alpine_bits_auth` section is still read and synced to the database automatically:
|
||||||
|
|
||||||
|
```yaml
|
||||||
|
alpine_bits_auth:
|
||||||
|
- hotel_id: "123"
|
||||||
|
hotel_name: "Example Hotel"
|
||||||
|
username: "hotel123"
|
||||||
|
password: "secret" # Will be hashed with bcrypt in database
|
||||||
|
meta_account: "1234567890"
|
||||||
|
google_account: "9876543210"
|
||||||
|
push_endpoint:
|
||||||
|
url: "https://example.com/push"
|
||||||
|
token: "token123"
|
||||||
|
username: "pushuser"
|
||||||
|
```
|
||||||
|
|
||||||
|
### New Optional Config
|
||||||
|
You can add webhook-specific configuration:
|
||||||
|
|
||||||
|
```yaml
|
||||||
|
webhooks:
|
||||||
|
stale_timeout_minutes: 10 # Timeout for stuck webhooks (default: 10)
|
||||||
|
payload_retention_days: 7 # Days before purging payload_json (default: 7)
|
||||||
|
cleanup_interval_minutes: 5 # How often to run cleanup (default: 5)
|
||||||
|
```
|
||||||
|
|
||||||
|
## Database Queries
|
||||||
|
|
||||||
|
### View All Webhook URLs
|
||||||
|
```sql
|
||||||
|
SELECT
|
||||||
|
h.hotel_id,
|
||||||
|
h.hotel_name,
|
||||||
|
we.webhook_type,
|
||||||
|
we.webhook_secret,
|
||||||
|
'https://your-domain.com/api/webhook/' || we.webhook_secret AS webhook_url
|
||||||
|
FROM hotels h
|
||||||
|
JOIN webhook_endpoints we ON h.hotel_id = we.hotel_id
|
||||||
|
WHERE we.is_enabled = true
|
||||||
|
ORDER BY h.hotel_id, we.webhook_type;
|
||||||
|
```
|
||||||
|
|
||||||
|
### View Recent Webhook Activity
|
||||||
|
```sql
|
||||||
|
SELECT
|
||||||
|
wr.id,
|
||||||
|
wr.created_at,
|
||||||
|
h.hotel_name,
|
||||||
|
we.webhook_type,
|
||||||
|
wr.status,
|
||||||
|
wr.retry_count,
|
||||||
|
wr.created_customer_id,
|
||||||
|
wr.created_reservation_id
|
||||||
|
FROM webhook_requests wr
|
||||||
|
JOIN webhook_endpoints we ON wr.webhook_endpoint_id = we.id
|
||||||
|
JOIN hotels h ON we.hotel_id = h.hotel_id
|
||||||
|
ORDER BY wr.created_at DESC
|
||||||
|
LIMIT 50;
|
||||||
|
```
|
||||||
|
|
||||||
|
### View Failed Webhooks
|
||||||
|
```sql
|
||||||
|
SELECT
|
||||||
|
wr.id,
|
||||||
|
wr.created_at,
|
||||||
|
h.hotel_name,
|
||||||
|
we.webhook_type,
|
||||||
|
wr.retry_count,
|
||||||
|
wr.last_error
|
||||||
|
FROM webhook_requests wr
|
||||||
|
JOIN webhook_endpoints we ON wr.webhook_endpoint_id = we.id
|
||||||
|
JOIN hotels h ON we.hotel_id = h.hotel_id
|
||||||
|
WHERE wr.status = 'failed'
|
||||||
|
ORDER BY wr.created_at DESC;
|
||||||
|
```
|
||||||
|
|
||||||
|
### Webhook Statistics
|
||||||
|
```sql
|
||||||
|
SELECT
|
||||||
|
h.hotel_name,
|
||||||
|
we.webhook_type,
|
||||||
|
COUNT(*) AS total_requests,
|
||||||
|
SUM(CASE WHEN wr.status = 'completed' THEN 1 ELSE 0 END) AS completed,
|
||||||
|
SUM(CASE WHEN wr.status = 'failed' THEN 1 ELSE 0 END) AS failed,
|
||||||
|
SUM(CASE WHEN wr.status = 'processing' THEN 1 ELSE 0 END) AS processing,
|
||||||
|
AVG(EXTRACT(EPOCH FROM (wr.processing_completed_at - wr.processing_started_at))) AS avg_processing_seconds
|
||||||
|
FROM webhook_requests wr
|
||||||
|
JOIN webhook_endpoints we ON wr.webhook_endpoint_id = we.id
|
||||||
|
JOIN hotels h ON we.hotel_id = h.hotel_id
|
||||||
|
WHERE wr.created_at > NOW() - INTERVAL '7 days'
|
||||||
|
GROUP BY h.hotel_name, we.webhook_type
|
||||||
|
ORDER BY total_requests DESC;
|
||||||
|
```
|
||||||
|
|
||||||
|
## Security Considerations
|
||||||
|
|
||||||
|
### 1. Password Storage
|
||||||
|
- Passwords are hashed with bcrypt (12 rounds)
|
||||||
|
- Plain text passwords never stored in database
|
||||||
|
- Config sync does NOT update password_hash (security)
|
||||||
|
- To change password: manually update database or delete hotel record
|
||||||
|
|
||||||
|
### 2. Webhook Secrets
|
||||||
|
- Generated using `secrets.token_urlsafe(48)` (cryptographically secure)
|
||||||
|
- 64-character URL-safe strings
|
||||||
|
- Unique per endpoint
|
||||||
|
- Act as API keys (no additional auth needed)
|
||||||
|
|
||||||
|
### 3. Payload Size Limits
|
||||||
|
- 10MB maximum payload size
|
||||||
|
- Prevents memory exhaustion attacks
|
||||||
|
- Configurable in code
|
||||||
|
|
||||||
|
### 4. Rate Limiting
|
||||||
|
- Existing rate limiting still applies
|
||||||
|
- Uses slowapi with configured limits
|
||||||
|
|
||||||
|
## Next Steps
|
||||||
|
|
||||||
|
1. **Test Migration** - Run `uv run alembic upgrade head` in test environment
|
||||||
|
2. **Verify Sync** - Start application and check logs for hotel sync statistics
|
||||||
|
3. **Test Webhook URLs** - Send test payloads to new unified endpoint
|
||||||
|
4. **Monitor Performance** - Watch for any issues with concurrent webhooks
|
||||||
|
5. **Add Tests** - Write comprehensive test suite
|
||||||
|
6. **Update Documentation** - Document webhook URLs for external integrations
|
||||||
|
7. **Consider Admin UI** - Build admin interface for managing hotels/webhooks
|
||||||
|
|
||||||
|
## Files Modified
|
||||||
|
|
||||||
|
1. `src/alpine_bits_python/db.py` - Added Hotel, WebhookEndpoint, WebhookRequest models
|
||||||
|
2. `src/alpine_bits_python/db_setup.py` - Added config sync call
|
||||||
|
3. `src/alpine_bits_python/api.py` - Added unified handler, cleanup functions, processor initialization
|
||||||
|
4. `src/alpine_bits_python/hotel_service.py` - NEW FILE
|
||||||
|
5. `src/alpine_bits_python/webhook_processor.py` - NEW FILE
|
||||||
|
6. `alembic/versions/2025_11_25_1155-*.py` - NEW MIGRATION
|
||||||
|
|
||||||
|
## Rollback Plan
|
||||||
|
|
||||||
|
If issues are discovered:
|
||||||
|
|
||||||
|
1. **Rollback Migration:**
|
||||||
|
```bash
|
||||||
|
uv run alembic downgrade -1
|
||||||
|
```
|
||||||
|
|
||||||
|
2. **Revert Code:**
|
||||||
|
```bash
|
||||||
|
git revert <commit-hash>
|
||||||
|
```
|
||||||
|
|
||||||
|
3. **Fallback:**
|
||||||
|
- Legacy endpoints (`/webhook/wix-form`, `/webhook/generic`) still work
|
||||||
|
- No breaking changes to existing integrations
|
||||||
|
- Can disable new unified handler by removing route
|
||||||
|
|
||||||
|
## Success Metrics
|
||||||
|
|
||||||
|
- ✅ No duplicate customers/reservations created from concurrent webhooks
|
||||||
|
- ✅ Webhook processing latency maintained
|
||||||
|
- ✅ Zero data loss during migration
|
||||||
|
- ✅ Backward compatibility maintained
|
||||||
|
- ✅ Memory usage stable (payload purging working)
|
||||||
|
- ✅ Error rate < 1% for webhook processing
|
||||||
|
|
||||||
|
## Support
|
||||||
|
|
||||||
|
For issues or questions:
|
||||||
|
1. Check application logs for errors
|
||||||
|
2. Query `webhook_requests` table for failed webhooks
|
||||||
|
3. Review this document for configuration options
|
||||||
|
4. Check GitHub issues for known problems
|
||||||
@@ -0,0 +1,119 @@
|
|||||||
|
"""add_hotels_and_webhook_tables
|
||||||
|
|
||||||
|
Revision ID: e7ee03d8f430
|
||||||
|
Revises: a1b2c3d4e5f6
|
||||||
|
Create Date: 2025-11-25 11:55:18.872715
|
||||||
|
|
||||||
|
"""
|
||||||
|
from typing import Sequence, Union
|
||||||
|
|
||||||
|
from alembic import op
|
||||||
|
import sqlalchemy as sa
|
||||||
|
|
||||||
|
|
||||||
|
# revision identifiers, used by Alembic.
|
||||||
|
revision: str = 'e7ee03d8f430'
|
||||||
|
down_revision: Union[str, Sequence[str], None] = 'a1b2c3d4e5f6'
|
||||||
|
branch_labels: Union[str, Sequence[str], None] = None
|
||||||
|
depends_on: Union[str, Sequence[str], None] = None
|
||||||
|
|
||||||
|
|
||||||
|
def upgrade() -> None:
|
||||||
|
"""Upgrade schema."""
|
||||||
|
# Create hotels table
|
||||||
|
op.create_table(
|
||||||
|
'hotels',
|
||||||
|
sa.Column('id', sa.Integer(), nullable=False),
|
||||||
|
sa.Column('hotel_id', sa.String(length=50), nullable=False),
|
||||||
|
sa.Column('hotel_name', sa.String(length=200), nullable=False),
|
||||||
|
sa.Column('username', sa.String(length=100), nullable=False),
|
||||||
|
sa.Column('password_hash', sa.String(length=200), nullable=False),
|
||||||
|
sa.Column('meta_account_id', sa.String(length=50), nullable=True),
|
||||||
|
sa.Column('google_account_id', sa.String(length=50), nullable=True),
|
||||||
|
sa.Column('push_endpoint_url', sa.String(length=500), nullable=True),
|
||||||
|
sa.Column('push_endpoint_token', sa.String(length=200), nullable=True),
|
||||||
|
sa.Column('push_endpoint_username', sa.String(length=100), nullable=True),
|
||||||
|
sa.Column('created_at', sa.DateTime(timezone=True), nullable=False),
|
||||||
|
sa.Column('updated_at', sa.DateTime(timezone=True), nullable=False),
|
||||||
|
sa.Column('is_active', sa.Boolean(), nullable=False, default=True),
|
||||||
|
sa.PrimaryKeyConstraint('id')
|
||||||
|
)
|
||||||
|
op.create_index(op.f('ix_hotels_hotel_id'), 'hotels', ['hotel_id'], unique=True)
|
||||||
|
op.create_index(op.f('ix_hotels_username'), 'hotels', ['username'], unique=True)
|
||||||
|
op.create_index(op.f('ix_hotels_is_active'), 'hotels', ['is_active'], unique=False)
|
||||||
|
|
||||||
|
# Create webhook_endpoints table
|
||||||
|
op.create_table(
|
||||||
|
'webhook_endpoints',
|
||||||
|
sa.Column('id', sa.Integer(), nullable=False),
|
||||||
|
sa.Column('hotel_id', sa.String(length=50), nullable=False),
|
||||||
|
sa.Column('webhook_secret', sa.String(length=64), nullable=False),
|
||||||
|
sa.Column('webhook_type', sa.String(length=50), nullable=False),
|
||||||
|
sa.Column('description', sa.String(length=200), nullable=True),
|
||||||
|
sa.Column('is_enabled', sa.Boolean(), nullable=False, default=True),
|
||||||
|
sa.Column('created_at', sa.DateTime(timezone=True), nullable=False),
|
||||||
|
sa.ForeignKeyConstraint(['hotel_id'], ['hotels.hotel_id'], ),
|
||||||
|
sa.PrimaryKeyConstraint('id')
|
||||||
|
)
|
||||||
|
op.create_index(op.f('ix_webhook_endpoints_hotel_id'), 'webhook_endpoints', ['hotel_id'], unique=False)
|
||||||
|
op.create_index(op.f('ix_webhook_endpoints_webhook_secret'), 'webhook_endpoints', ['webhook_secret'], unique=True)
|
||||||
|
op.create_index('idx_webhook_endpoint_hotel_type', 'webhook_endpoints', ['hotel_id', 'webhook_type'], unique=False)
|
||||||
|
|
||||||
|
# Create webhook_requests table
|
||||||
|
op.create_table(
|
||||||
|
'webhook_requests',
|
||||||
|
sa.Column('id', sa.Integer(), nullable=False),
|
||||||
|
sa.Column('payload_hash', sa.String(length=64), nullable=False),
|
||||||
|
sa.Column('webhook_endpoint_id', sa.Integer(), nullable=True),
|
||||||
|
sa.Column('hotel_id', sa.String(length=50), nullable=True),
|
||||||
|
sa.Column('status', sa.String(length=20), nullable=False, default='pending'),
|
||||||
|
sa.Column('processing_started_at', sa.DateTime(timezone=True), nullable=True),
|
||||||
|
sa.Column('processing_completed_at', sa.DateTime(timezone=True), nullable=True),
|
||||||
|
sa.Column('retry_count', sa.Integer(), nullable=True, default=0),
|
||||||
|
sa.Column('last_error', sa.String(length=2000), nullable=True),
|
||||||
|
sa.Column('payload_json', sa.JSON(), nullable=True),
|
||||||
|
sa.Column('payload_size_bytes', sa.Integer(), nullable=True),
|
||||||
|
sa.Column('purged_at', sa.DateTime(timezone=True), nullable=True),
|
||||||
|
sa.Column('created_at', sa.DateTime(timezone=True), nullable=False),
|
||||||
|
sa.Column('source_ip', sa.String(length=45), nullable=True),
|
||||||
|
sa.Column('user_agent', sa.String(length=500), nullable=True),
|
||||||
|
sa.Column('created_customer_id', sa.Integer(), nullable=True),
|
||||||
|
sa.Column('created_reservation_id', sa.Integer(), nullable=True),
|
||||||
|
sa.ForeignKeyConstraint(['webhook_endpoint_id'], ['webhook_endpoints.id'], ),
|
||||||
|
sa.ForeignKeyConstraint(['hotel_id'], ['hotels.hotel_id'], ),
|
||||||
|
sa.ForeignKeyConstraint(['created_customer_id'], ['customers.id'], ),
|
||||||
|
sa.ForeignKeyConstraint(['created_reservation_id'], ['reservations.id'], ),
|
||||||
|
sa.PrimaryKeyConstraint('id')
|
||||||
|
)
|
||||||
|
op.create_index(op.f('ix_webhook_requests_payload_hash'), 'webhook_requests', ['payload_hash'], unique=True)
|
||||||
|
op.create_index(op.f('ix_webhook_requests_webhook_endpoint_id'), 'webhook_requests', ['webhook_endpoint_id'], unique=False)
|
||||||
|
op.create_index(op.f('ix_webhook_requests_hotel_id'), 'webhook_requests', ['hotel_id'], unique=False)
|
||||||
|
op.create_index(op.f('ix_webhook_requests_status'), 'webhook_requests', ['status'], unique=False)
|
||||||
|
op.create_index(op.f('ix_webhook_requests_created_at'), 'webhook_requests', ['created_at'], unique=False)
|
||||||
|
op.create_index('idx_webhook_status_created', 'webhook_requests', ['status', 'created_at'], unique=False)
|
||||||
|
op.create_index('idx_webhook_hotel_created', 'webhook_requests', ['hotel_id', 'created_at'], unique=False)
|
||||||
|
op.create_index('idx_webhook_purge_candidate', 'webhook_requests', ['status', 'purged_at', 'created_at'], unique=False)
|
||||||
|
|
||||||
|
|
||||||
|
def downgrade() -> None:
|
||||||
|
"""Downgrade schema."""
|
||||||
|
# Drop tables in reverse order (respecting foreign key constraints)
|
||||||
|
op.drop_index('idx_webhook_purge_candidate', table_name='webhook_requests')
|
||||||
|
op.drop_index('idx_webhook_hotel_created', table_name='webhook_requests')
|
||||||
|
op.drop_index('idx_webhook_status_created', table_name='webhook_requests')
|
||||||
|
op.drop_index(op.f('ix_webhook_requests_created_at'), table_name='webhook_requests')
|
||||||
|
op.drop_index(op.f('ix_webhook_requests_status'), table_name='webhook_requests')
|
||||||
|
op.drop_index(op.f('ix_webhook_requests_hotel_id'), table_name='webhook_requests')
|
||||||
|
op.drop_index(op.f('ix_webhook_requests_webhook_endpoint_id'), table_name='webhook_requests')
|
||||||
|
op.drop_index(op.f('ix_webhook_requests_payload_hash'), table_name='webhook_requests')
|
||||||
|
op.drop_table('webhook_requests')
|
||||||
|
|
||||||
|
op.drop_index('idx_webhook_endpoint_hotel_type', table_name='webhook_endpoints')
|
||||||
|
op.drop_index(op.f('ix_webhook_endpoints_webhook_secret'), table_name='webhook_endpoints')
|
||||||
|
op.drop_index(op.f('ix_webhook_endpoints_hotel_id'), table_name='webhook_endpoints')
|
||||||
|
op.drop_table('webhook_endpoints')
|
||||||
|
|
||||||
|
op.drop_index(op.f('ix_hotels_is_active'), table_name='hotels')
|
||||||
|
op.drop_index(op.f('ix_hotels_username'), table_name='hotels')
|
||||||
|
op.drop_index(op.f('ix_hotels_hotel_id'), table_name='hotels')
|
||||||
|
op.drop_table('hotels')
|
||||||
@@ -392923,3 +392923,74 @@ DETAIL: Key (hotel_id, guest_id)=(39054_001, 28275) is not present in table "co
|
|||||||
[SQL: INSERT INTO conversions (reservation_id, customer_id, hashed_customer_id, hotel_id, guest_id, pms_reservation_id, reservation_number, reservation_date, creation_time, reservation_type, booking_channel, advertising_medium, advertising_partner, advertising_campagne, directly_attributable, guest_matched, created_at, updated_at) VALUES ($1::INTEGER, $2::INTEGER, $3::INTEGER, $4::VARCHAR, $5::VARCHAR, $6::VARCHAR, $7::VARCHAR, $8::DATE, $9::TIMESTAMP WITH TIME ZONE, $10::VARCHAR, $11::VARCHAR, $12::VARCHAR, $13::VARCHAR, $14::VARCHAR, $15::BOOLEAN, $16::BOOLEAN, $17::TIMESTAMP WITH TIME ZONE, $18::TIMESTAMP WITH TIME ZONE) RETURNING conversions.id]
|
[SQL: INSERT INTO conversions (reservation_id, customer_id, hashed_customer_id, hotel_id, guest_id, pms_reservation_id, reservation_number, reservation_date, creation_time, reservation_type, booking_channel, advertising_medium, advertising_partner, advertising_campagne, directly_attributable, guest_matched, created_at, updated_at) VALUES ($1::INTEGER, $2::INTEGER, $3::INTEGER, $4::VARCHAR, $5::VARCHAR, $6::VARCHAR, $7::VARCHAR, $8::DATE, $9::TIMESTAMP WITH TIME ZONE, $10::VARCHAR, $11::VARCHAR, $12::VARCHAR, $13::VARCHAR, $14::VARCHAR, $15::BOOLEAN, $16::BOOLEAN, $17::TIMESTAMP WITH TIME ZONE, $18::TIMESTAMP WITH TIME ZONE) RETURNING conversions.id]
|
||||||
[parameters: (None, None, None, '39054_001', '28275', '833', '532', datetime.date(2014, 5, 24), datetime.datetime(2014, 5, 24, 13, 16, 27), 'reservation', None, None, None, None, False, False, datetime.datetime(2025, 11, 19, 16, 23, 58, 58789), datetime.datetime(2025, 11, 19, 16, 23, 58, 58791))]
|
[parameters: (None, None, None, '39054_001', '28275', '833', '532', datetime.date(2014, 5, 24), datetime.datetime(2014, 5, 24, 13, 16, 27), 'reservation', None, None, None, None, False, False, datetime.datetime(2025, 11, 19, 16, 23, 58, 58789), datetime.datetime(2025, 11, 19, 16, 23, 58, 58791))]
|
||||||
(Background on this error at: https://sqlalche.me/e/20/gkpj)
|
(Background on this error at: https://sqlalche.me/e/20/gkpj)
|
||||||
|
2025-11-25 12:02:21 - root - INFO - Logging to file: config/alpinebits.log
|
||||||
|
2025-11-25 12:02:21 - root - INFO - Logging configured at INFO level
|
||||||
|
2025-11-25 12:02:21 - alpine_bits_python.notification_service - INFO - Registered notification backend: pushover
|
||||||
|
2025-11-25 12:02:21 - alpine_bits_python.notification_manager - INFO - Registered pushover backend with priority 0
|
||||||
|
2025-11-25 12:02:21 - alpine_bits_python.notification_manager - INFO - Notification service configured with backends: ['pushover']
|
||||||
|
2025-11-25 12:02:21 - alpine_bits_python.api - INFO - Application startup initiated (primary_worker=True)
|
||||||
|
2025-11-25 12:02:21 - alpine_bits_python.db - INFO - Configured database schema: alpinebits
|
||||||
|
2025-11-25 12:02:21 - alpine_bits_python.db - INFO - Setting PostgreSQL search_path to: alpinebits,public
|
||||||
|
2025-11-25 12:02:21 - alpine_bits_python.alpinebits_server - INFO - Initializing action instance for AlpineBitsActionName.OTA_HOTEL_NOTIF_REPORT
|
||||||
|
2025-11-25 12:02:21 - alpine_bits_python.alpinebits_server - INFO - Initializing action instance for AlpineBitsActionName.OTA_PING
|
||||||
|
2025-11-25 12:02:21 - alpine_bits_python.alpinebits_server - INFO - Initializing action instance for AlpineBitsActionName.OTA_HOTEL_RES_NOTIF_GUEST_REQUESTS
|
||||||
|
2025-11-25 12:02:21 - alpine_bits_python.alpinebits_server - INFO - Initializing action instance for AlpineBitsActionName.OTA_READ
|
||||||
|
2025-11-25 12:02:21 - alpine_bits_python.webhook_processor - INFO - Registered webhook processor: wix_form
|
||||||
|
2025-11-25 12:02:21 - alpine_bits_python.webhook_processor - INFO - Registered webhook processor: generic
|
||||||
|
2025-11-25 12:02:21 - alpine_bits_python.webhook_processor - INFO - Webhook processors initialized
|
||||||
|
2025-11-25 12:02:21 - alpine_bits_python.api - INFO - Webhook processors initialized
|
||||||
|
2025-11-25 12:02:21 - alpine_bits_python.api - INFO - Hotel 39054_001 has no push_endpoint configured
|
||||||
|
2025-11-25 12:02:21 - alpine_bits_python.api - INFO - Hotel 135 has no push_endpoint configured
|
||||||
|
2025-11-25 12:02:21 - alpine_bits_python.api - INFO - Hotel 39052_001 has no push_endpoint configured
|
||||||
|
2025-11-25 12:02:21 - alpine_bits_python.api - INFO - Hotel 39040_001 has no push_endpoint configured
|
||||||
|
2025-11-25 12:02:21 - alpine_bits_python.api - INFO - Running startup tasks (primary worker)...
|
||||||
|
2025-11-25 12:03:13 - root - INFO - Logging to file: config/alpinebits.log
|
||||||
|
2025-11-25 12:03:13 - root - INFO - Logging configured at INFO level
|
||||||
|
2025-11-25 12:03:13 - alpine_bits_python.notification_service - INFO - Registered notification backend: pushover
|
||||||
|
2025-11-25 12:03:13 - alpine_bits_python.notification_manager - INFO - Registered pushover backend with priority 0
|
||||||
|
2025-11-25 12:03:13 - alpine_bits_python.notification_manager - INFO - Notification service configured with backends: ['pushover']
|
||||||
|
2025-11-25 12:03:13 - alpine_bits_python.api - INFO - Application startup initiated (primary_worker=True)
|
||||||
|
2025-11-25 12:03:13 - alpine_bits_python.db - INFO - Configured database schema: alpinebits
|
||||||
|
2025-11-25 12:03:13 - alpine_bits_python.db - INFO - Setting PostgreSQL search_path to: alpinebits,public
|
||||||
|
2025-11-25 12:03:13 - alpine_bits_python.alpinebits_server - INFO - Initializing action instance for AlpineBitsActionName.OTA_HOTEL_NOTIF_REPORT
|
||||||
|
2025-11-25 12:03:13 - alpine_bits_python.alpinebits_server - INFO - Initializing action instance for AlpineBitsActionName.OTA_PING
|
||||||
|
2025-11-25 12:03:13 - alpine_bits_python.alpinebits_server - INFO - Initializing action instance for AlpineBitsActionName.OTA_HOTEL_RES_NOTIF_GUEST_REQUESTS
|
||||||
|
2025-11-25 12:03:13 - alpine_bits_python.alpinebits_server - INFO - Initializing action instance for AlpineBitsActionName.OTA_READ
|
||||||
|
2025-11-25 12:03:13 - alpine_bits_python.webhook_processor - INFO - Registered webhook processor: wix_form
|
||||||
|
2025-11-25 12:03:13 - alpine_bits_python.webhook_processor - INFO - Registered webhook processor: generic
|
||||||
|
2025-11-25 12:03:13 - alpine_bits_python.webhook_processor - INFO - Webhook processors initialized
|
||||||
|
2025-11-25 12:03:13 - alpine_bits_python.api - INFO - Webhook processors initialized
|
||||||
|
2025-11-25 12:03:13 - alpine_bits_python.api - INFO - Hotel 39054_001 has no push_endpoint configured
|
||||||
|
2025-11-25 12:03:13 - alpine_bits_python.api - INFO - Hotel 135 has no push_endpoint configured
|
||||||
|
2025-11-25 12:03:13 - alpine_bits_python.api - INFO - Hotel 39052_001 has no push_endpoint configured
|
||||||
|
2025-11-25 12:03:13 - alpine_bits_python.api - INFO - Hotel 39040_001 has no push_endpoint configured
|
||||||
|
2025-11-25 12:03:13 - alpine_bits_python.api - INFO - Running startup tasks (primary worker)...
|
||||||
|
2025-11-25 12:03:14 - alpine_bits_python.hotel_service - INFO - Created hotel: 39054_001
|
||||||
|
2025-11-25 12:03:14 - alpine_bits_python.hotel_service - INFO - Created webhook endpoint for hotel 39054_001, type=wix_form, secret=E3wZiShNY47_KqwZUxBe7BJKluorJYj6qHfclrQg1UnyurYojyw_f0Z1KJBrXIuG
|
||||||
|
2025-11-25 12:03:14 - alpine_bits_python.hotel_service - INFO - Created webhook endpoint for hotel 39054_001, type=generic, secret=PmhN4o5MR4VYR9U04kwgp4nRnJ2FMfxm1V2TARIh46Qx49Iy0sndPzeIYXM31KTU
|
||||||
|
2025-11-25 12:03:14 - alpine_bits_python.hotel_service - INFO - Created hotel: 135
|
||||||
|
2025-11-25 12:03:14 - alpine_bits_python.hotel_service - INFO - Created webhook endpoint for hotel 135, type=wix_form, secret=Rb0Dp1mFHZH5vbF66ZsUXoSiIVJdVjskxjlz-PLYXglaJ_DXGo7B7dtw0xbO15_O
|
||||||
|
2025-11-25 12:03:14 - alpine_bits_python.hotel_service - INFO - Created webhook endpoint for hotel 135, type=generic, secret=KCblY6u535uAgQ-nk0DS24FsilkS73hsplVecXy8vFg0GqDFX1lsF2U4JGmPURtf
|
||||||
|
2025-11-25 12:03:14 - alpine_bits_python.hotel_service - INFO - Created hotel: 39052_001
|
||||||
|
2025-11-25 12:03:14 - alpine_bits_python.hotel_service - INFO - Created webhook endpoint for hotel 39052_001, type=wix_form, secret=d-oKOTZ4GqcNIdnR6cMcHtpEUWXNpFgwbc0qRXS_9m1J2vJRYHTjWs3pb8XvF3B_
|
||||||
|
2025-11-25 12:03:14 - alpine_bits_python.hotel_service - INFO - Created webhook endpoint for hotel 39052_001, type=generic, secret=bdO04CoGOHyl7P8zFOB8dNxaxkccNNyXgM_za9pPiFqm4LmUM4KQ0l2qfOTu7gyM
|
||||||
|
2025-11-25 12:03:14 - alpine_bits_python.hotel_service - INFO - Created hotel: 39040_001
|
||||||
|
2025-11-25 12:03:14 - alpine_bits_python.hotel_service - INFO - Created webhook endpoint for hotel 39040_001, type=wix_form, secret=Arj5jksgLxgJcad9OAGIZzWfF1x1g6g965EZKGp-njDsF2oK-jiODYlN4HiO46cz
|
||||||
|
2025-11-25 12:03:14 - alpine_bits_python.hotel_service - INFO - Created webhook endpoint for hotel 39040_001, type=generic, secret=A-_w63IXXmwsztd1pN6wDEtvr_oKO1GRROO5ff9lad0VTAh7WH5mzqUwdi9H8Be-
|
||||||
|
2025-11-25 12:03:14 - alpine_bits_python.hotel_service - INFO - Config sync complete: 4 hotels created, 0 updated, 8 endpoints created
|
||||||
|
2025-11-25 12:03:14 - alpine_bits_python.db_setup - INFO - Config sync: 4 hotels created, 0 updated, 8 endpoints created
|
||||||
|
2025-11-25 12:03:15 - alpine_bits_python.db_setup - INFO - Backfilling advertising account IDs for existing reservations...
|
||||||
|
2025-11-25 12:03:15 - alpine_bits_python.db_setup - INFO - Found 4 hotel(s) with account configurations
|
||||||
|
2025-11-25 12:03:15 - alpine_bits_python.db_setup - INFO - Backfilling usernames for existing acked_requests...
|
||||||
|
2025-11-25 12:03:15 - alpine_bits_python.db_setup - INFO - Found 4 hotel(s) with usernames in config
|
||||||
|
2025-11-25 12:03:15 - alpine_bits_python.api - INFO - Startup tasks completed
|
||||||
|
2025-11-25 12:03:15 - alpine_bits_python.api - INFO - Webhook periodic cleanup task started
|
||||||
|
2025-11-25 12:03:15 - alpine_bits_python.api - INFO - Application startup complete
|
||||||
|
2025-11-25 12:03:35 - alpine_bits_python.api - INFO - Application shutdown initiated
|
||||||
|
2025-11-25 12:03:35 - alpine_bits_python.api - INFO - Webhook cleanup task cancelled
|
||||||
|
2025-11-25 12:03:35 - alpine_bits_python.api - INFO - Webhook cleanup task stopped
|
||||||
|
2025-11-25 12:03:35 - alpine_bits_python.email_service - INFO - Shutting down email service thread pool
|
||||||
|
2025-11-25 12:03:35 - alpine_bits_python.email_service - INFO - Email service thread pool shut down complete
|
||||||
|
2025-11-25 12:03:35 - alpine_bits_python.api - INFO - Email service shut down
|
||||||
|
2025-11-25 12:03:35 - alpine_bits_python.api - INFO - Application shutdown complete
|
||||||
|
2025-11-25 12:03:35 - alpine_bits_python.worker_coordination - INFO - Released primary worker lock (pid=22943)
|
||||||
|
|||||||
@@ -13,6 +13,7 @@ dependencies = [
|
|||||||
"alembic>=1.17.2",
|
"alembic>=1.17.2",
|
||||||
"annotatedyaml>=1.0.0",
|
"annotatedyaml>=1.0.0",
|
||||||
"asyncpg>=0.30.0",
|
"asyncpg>=0.30.0",
|
||||||
|
"bcrypt>=5.0.0",
|
||||||
"dotenv>=0.9.9",
|
"dotenv>=0.9.9",
|
||||||
"fast-langdetect>=1.0.0",
|
"fast-langdetect>=1.0.0",
|
||||||
"fastapi>=0.117.1",
|
"fastapi>=0.117.1",
|
||||||
|
|||||||
@@ -2,6 +2,7 @@
|
|||||||
|
|
||||||
import asyncio
|
import asyncio
|
||||||
import gzip
|
import gzip
|
||||||
|
import hashlib
|
||||||
import json
|
import json
|
||||||
import multiprocessing
|
import multiprocessing
|
||||||
import os
|
import os
|
||||||
@@ -9,7 +10,8 @@ import traceback
|
|||||||
import urllib.parse
|
import urllib.parse
|
||||||
import xml.dom.minidom
|
import xml.dom.minidom
|
||||||
from collections import defaultdict
|
from collections import defaultdict
|
||||||
from datetime import date, datetime
|
from contextlib import asynccontextmanager
|
||||||
|
from datetime import UTC, date, datetime, timedelta
|
||||||
from functools import partial
|
from functools import partial
|
||||||
from pathlib import Path
|
from pathlib import Path
|
||||||
from typing import Any
|
from typing import Any
|
||||||
@@ -27,10 +29,13 @@ from fastapi.security import (
|
|||||||
)
|
)
|
||||||
from pydantic import BaseModel
|
from pydantic import BaseModel
|
||||||
from slowapi.errors import RateLimitExceeded
|
from slowapi.errors import RateLimitExceeded
|
||||||
|
from sqlalchemy import and_, select, update
|
||||||
from sqlalchemy.exc import IntegrityError
|
from sqlalchemy.exc import IntegrityError
|
||||||
from sqlalchemy.ext.asyncio import async_sessionmaker
|
from sqlalchemy.ext.asyncio import async_sessionmaker
|
||||||
|
from sqlalchemy.orm import selectinload
|
||||||
|
|
||||||
from alpine_bits_python.schemas import ReservationData
|
from alpine_bits_python.schemas import ReservationData
|
||||||
|
from alpine_bits_python.webhook_processor import process_generic_webhook_submission
|
||||||
|
|
||||||
from .alpinebits_server import (
|
from .alpinebits_server import (
|
||||||
AlpineBitsActionName,
|
AlpineBitsActionName,
|
||||||
@@ -46,6 +51,7 @@ from .csv_import import CSVImporter
|
|||||||
from .customer_service import CustomerService
|
from .customer_service import CustomerService
|
||||||
from .db import Customer as DBCustomer
|
from .db import Customer as DBCustomer
|
||||||
from .db import Reservation as DBReservation
|
from .db import Reservation as DBReservation
|
||||||
|
from .db import Hotel, WebhookEndpoint, WebhookRequest
|
||||||
from .db import ResilientAsyncSession, SessionMaker, create_database_engine
|
from .db import ResilientAsyncSession, SessionMaker, create_database_engine
|
||||||
from .db_setup import run_startup_tasks
|
from .db_setup import run_startup_tasks
|
||||||
from .email_monitoring import ReservationStatsCollector
|
from .email_monitoring import ReservationStatsCollector
|
||||||
@@ -61,6 +67,7 @@ from .rate_limit import (
|
|||||||
webhook_limiter,
|
webhook_limiter,
|
||||||
)
|
)
|
||||||
from .reservation_service import ReservationService
|
from .reservation_service import ReservationService
|
||||||
|
from .webhook_processor import webhook_registry
|
||||||
from .worker_coordination import is_primary_worker
|
from .worker_coordination import is_primary_worker
|
||||||
|
|
||||||
# Configure logging - will be reconfigured during lifespan with actual config
|
# Configure logging - will be reconfigured during lifespan with actual config
|
||||||
@@ -249,9 +256,113 @@ async def push_listener(customer: DBCustomer, reservation: DBReservation, hotel)
|
|||||||
|
|
||||||
except Exception as e:
|
except Exception as e:
|
||||||
_LOGGER.exception("Push event failed for hotel %s: %s", hotel["hotel_id"], e)
|
_LOGGER.exception("Push event failed for hotel %s: %s", hotel["hotel_id"], e)
|
||||||
# Optionally implement retry logic here@asynccontextmanager
|
# Optionally implement retry logic here
|
||||||
|
|
||||||
|
|
||||||
|
async def cleanup_stale_webhooks(
|
||||||
|
async_sessionmaker: async_sessionmaker,
|
||||||
|
timeout_minutes: int = 10
|
||||||
|
) -> int:
|
||||||
|
"""Reset webhooks stuck in 'processing' (worker crashed).
|
||||||
|
|
||||||
|
Args:
|
||||||
|
async_sessionmaker: SQLAlchemy async sessionmaker
|
||||||
|
timeout_minutes: Timeout threshold in minutes
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
Number of stale webhooks reset
|
||||||
|
"""
|
||||||
|
timeout_threshold = datetime.now(UTC) - timedelta(minutes=timeout_minutes)
|
||||||
|
|
||||||
|
async with async_sessionmaker() as session:
|
||||||
|
result = await session.execute(
|
||||||
|
update(WebhookRequest)
|
||||||
|
.where(
|
||||||
|
and_(
|
||||||
|
WebhookRequest.status == 'processing',
|
||||||
|
WebhookRequest.processing_started_at < timeout_threshold
|
||||||
|
)
|
||||||
|
)
|
||||||
|
.values(
|
||||||
|
status='failed',
|
||||||
|
last_error='Processing timeout - worker may have crashed'
|
||||||
|
)
|
||||||
|
)
|
||||||
|
await session.commit()
|
||||||
|
count = result.rowcount
|
||||||
|
|
||||||
|
if count > 0:
|
||||||
|
_LOGGER.warning("Reset %d stale webhooks to 'failed'", count)
|
||||||
|
|
||||||
|
return count
|
||||||
|
|
||||||
|
|
||||||
|
async def purge_old_webhook_payloads(
|
||||||
|
async_sessionmaker: async_sessionmaker,
|
||||||
|
retention_days: int = 7
|
||||||
|
) -> int:
|
||||||
|
"""Purge payload_json from old completed webhooks.
|
||||||
|
|
||||||
|
Keeps metadata for history but removes large JSON payload.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
async_sessionmaker: SQLAlchemy async sessionmaker
|
||||||
|
retention_days: Days to retain payloads before purging
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
Number of payloads purged
|
||||||
|
"""
|
||||||
|
cutoff_date = datetime.now(UTC) - timedelta(days=retention_days)
|
||||||
|
|
||||||
|
async with async_sessionmaker() as session:
|
||||||
|
result = await session.execute(
|
||||||
|
update(WebhookRequest)
|
||||||
|
.where(
|
||||||
|
and_(
|
||||||
|
WebhookRequest.status == 'completed',
|
||||||
|
WebhookRequest.created_at < cutoff_date,
|
||||||
|
WebhookRequest.purged_at.is_(None) # Not already purged
|
||||||
|
)
|
||||||
|
)
|
||||||
|
.values(
|
||||||
|
payload_json=None,
|
||||||
|
purged_at=datetime.now(UTC)
|
||||||
|
)
|
||||||
|
)
|
||||||
|
await session.commit()
|
||||||
|
count = result.rowcount
|
||||||
|
|
||||||
|
if count > 0:
|
||||||
|
_LOGGER.info("Purged payloads from %d old webhook requests", count)
|
||||||
|
|
||||||
|
return count
|
||||||
|
|
||||||
|
|
||||||
|
async def periodic_webhook_cleanup(async_sessionmaker: async_sessionmaker):
|
||||||
|
"""Run periodic cleanup tasks for webhooks.
|
||||||
|
|
||||||
|
This should be scheduled to run every 5-10 minutes.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
async_sessionmaker: SQLAlchemy async sessionmaker
|
||||||
|
"""
|
||||||
|
try:
|
||||||
|
# Clean up stale webhooks (stuck in 'processing')
|
||||||
|
stale_count = await cleanup_stale_webhooks(async_sessionmaker)
|
||||||
|
|
||||||
|
# Purge old webhook payloads (older than 7 days)
|
||||||
|
purged_count = await purge_old_webhook_payloads(async_sessionmaker)
|
||||||
|
|
||||||
|
_LOGGER.debug(
|
||||||
|
"Webhook cleanup: %d stale reset, %d payloads purged",
|
||||||
|
stale_count,
|
||||||
|
purged_count
|
||||||
|
)
|
||||||
|
except Exception as e:
|
||||||
|
_LOGGER.exception("Error during periodic webhook cleanup: %s", e)
|
||||||
|
|
||||||
|
|
||||||
|
@asynccontextmanager
|
||||||
async def lifespan(app: FastAPI):
|
async def lifespan(app: FastAPI):
|
||||||
# Setup DB
|
# Setup DB
|
||||||
|
|
||||||
@@ -311,6 +422,11 @@ async def lifespan(app: FastAPI):
|
|||||||
app.state.alert_handler = alert_handler
|
app.state.alert_handler = alert_handler
|
||||||
app.state.report_scheduler = report_scheduler
|
app.state.report_scheduler = report_scheduler
|
||||||
|
|
||||||
|
# Initialize webhook processors
|
||||||
|
from .webhook_processor import initialize_webhook_processors
|
||||||
|
initialize_webhook_processors()
|
||||||
|
_LOGGER.info("Webhook processors initialized")
|
||||||
|
|
||||||
# Register push listeners for hotels with push_endpoint
|
# Register push listeners for hotels with push_endpoint
|
||||||
for hotel in config.get("alpine_bits_auth", []):
|
for hotel in config.get("alpine_bits_auth", []):
|
||||||
push_endpoint = hotel.get("push_endpoint")
|
push_endpoint = hotel.get("push_endpoint")
|
||||||
@@ -356,6 +472,24 @@ async def lifespan(app: FastAPI):
|
|||||||
report_scheduler.start()
|
report_scheduler.start()
|
||||||
_LOGGER.info("Daily report scheduler started")
|
_LOGGER.info("Daily report scheduler started")
|
||||||
|
|
||||||
|
# Start periodic webhook cleanup (only on primary worker)
|
||||||
|
cleanup_task = None
|
||||||
|
if is_primary:
|
||||||
|
async def run_periodic_cleanup():
|
||||||
|
"""Run cleanup tasks every 5 minutes."""
|
||||||
|
while True:
|
||||||
|
try:
|
||||||
|
await asyncio.sleep(300) # 5 minutes
|
||||||
|
await periodic_webhook_cleanup(AsyncSessionLocal)
|
||||||
|
except asyncio.CancelledError:
|
||||||
|
_LOGGER.info("Webhook cleanup task cancelled")
|
||||||
|
break
|
||||||
|
except Exception as e:
|
||||||
|
_LOGGER.exception("Error in periodic webhook cleanup: %s", e)
|
||||||
|
|
||||||
|
cleanup_task = asyncio.create_task(run_periodic_cleanup())
|
||||||
|
_LOGGER.info("Webhook periodic cleanup task started")
|
||||||
|
|
||||||
_LOGGER.info("Application startup complete")
|
_LOGGER.info("Application startup complete")
|
||||||
|
|
||||||
yield
|
yield
|
||||||
@@ -363,6 +497,15 @@ async def lifespan(app: FastAPI):
|
|||||||
# Cleanup on shutdown
|
# Cleanup on shutdown
|
||||||
_LOGGER.info("Application shutdown initiated")
|
_LOGGER.info("Application shutdown initiated")
|
||||||
|
|
||||||
|
# Stop webhook cleanup task
|
||||||
|
if cleanup_task:
|
||||||
|
cleanup_task.cancel()
|
||||||
|
try:
|
||||||
|
await cleanup_task
|
||||||
|
except asyncio.CancelledError:
|
||||||
|
pass
|
||||||
|
_LOGGER.info("Webhook cleanup task stopped")
|
||||||
|
|
||||||
# Stop daily report scheduler
|
# Stop daily report scheduler
|
||||||
if report_scheduler:
|
if report_scheduler:
|
||||||
report_scheduler.stop()
|
report_scheduler.stop()
|
||||||
@@ -761,266 +904,6 @@ async def process_wix_form_submission(request: Request, data: dict[str, Any], db
|
|||||||
}
|
}
|
||||||
|
|
||||||
|
|
||||||
async def process_generic_webhook_submission(
|
|
||||||
request: Request, data: dict[str, Any], db
|
|
||||||
):
|
|
||||||
"""Process generic webhook submissions with nested structure.
|
|
||||||
|
|
||||||
Expected structure:
|
|
||||||
{
|
|
||||||
"hotel_data": {"hotelname": "...", "hotelcode": "..."},
|
|
||||||
"form_data": {
|
|
||||||
"sprache": "de/it/en",
|
|
||||||
"anreise": "DD.MM.YYYY",
|
|
||||||
"abreise": "DD.MM.YYYY",
|
|
||||||
"erwachsene": "N",
|
|
||||||
"kinder": "N",
|
|
||||||
"alter": {"1": "age1", "2": "age2", ...},
|
|
||||||
"anrede": "...",
|
|
||||||
"name": "...",
|
|
||||||
"nachname": "...",
|
|
||||||
"mail": "...",
|
|
||||||
"tel": "...",
|
|
||||||
"nachricht": "..."
|
|
||||||
},
|
|
||||||
"tracking_data": {
|
|
||||||
"utm_source": "...",
|
|
||||||
"utm_medium": "...",
|
|
||||||
"utm_campaign": "...",
|
|
||||||
"utm_content": "...",
|
|
||||||
"utm_term": "...",
|
|
||||||
"fbclid": "...",
|
|
||||||
"gclid": "..."
|
|
||||||
},
|
|
||||||
"timestamp": "ISO8601"
|
|
||||||
}
|
|
||||||
"""
|
|
||||||
timestamp = datetime.now().isoformat()
|
|
||||||
_LOGGER.info("Processing generic webhook submission at %s", timestamp)
|
|
||||||
|
|
||||||
# Extract nested data
|
|
||||||
hotel_data = data.get("hotel_data", {})
|
|
||||||
form_data = data.get("form_data", {})
|
|
||||||
tracking_data = data.get("tracking_data", {})
|
|
||||||
offer_data = data.get("unterkunftTyp", {})
|
|
||||||
|
|
||||||
selected_offers = []
|
|
||||||
|
|
||||||
if offer_data:
|
|
||||||
# grab keys and values. If value is "on" add the key not the value to a list of selected offers
|
|
||||||
|
|
||||||
offer_data: dict[str, str]
|
|
||||||
|
|
||||||
for key, value in offer_data.items():
|
|
||||||
if value == "on":
|
|
||||||
selected_offers.append(key)
|
|
||||||
|
|
||||||
selected_offers_str = ", ".join(selected_offers) if selected_offers else None
|
|
||||||
|
|
||||||
# Extract hotel information
|
|
||||||
hotel_code = hotel_data.get("hotelcode")
|
|
||||||
hotel_name = hotel_data.get("hotelname")
|
|
||||||
|
|
||||||
if not hotel_code:
|
|
||||||
_LOGGER.warning("No hotel_code provided in webhook data, using default")
|
|
||||||
hotel_code = request.app.state.config.get("default_hotel_code", "123")
|
|
||||||
|
|
||||||
if not hotel_name:
|
|
||||||
hotel_name = (
|
|
||||||
request.app.state.config.get("default_hotel_name") or "Frangart Inn"
|
|
||||||
)
|
|
||||||
|
|
||||||
# Extract customer information
|
|
||||||
first_name = form_data.get("name")
|
|
||||||
last_name = form_data.get("nachname")
|
|
||||||
email = form_data.get("mail")
|
|
||||||
phone_number = form_data.get("tel")
|
|
||||||
name_prefix = form_data.get("anrede")
|
|
||||||
language = form_data.get("sprache", "de")[:2]
|
|
||||||
user_comment = form_data.get("nachricht", "")
|
|
||||||
plz = form_data.get("plz", "")
|
|
||||||
city = form_data.get("stadt", "")
|
|
||||||
country = form_data.get("land", "")
|
|
||||||
|
|
||||||
# Parse dates - handle DD.MM.YYYY format
|
|
||||||
start_date_str = form_data.get("anreise")
|
|
||||||
end_date_str = form_data.get("abreise")
|
|
||||||
|
|
||||||
if not start_date_str or not end_date_str:
|
|
||||||
raise HTTPException(
|
|
||||||
status_code=400, detail="Missing required dates (anreise/abreise)"
|
|
||||||
)
|
|
||||||
|
|
||||||
try:
|
|
||||||
# Parse DD.MM.YYYY format using strptime
|
|
||||||
start_date = datetime.strptime(start_date_str, "%d.%m.%Y").date()
|
|
||||||
end_date = datetime.strptime(end_date_str, "%d.%m.%Y").date()
|
|
||||||
except ValueError as e:
|
|
||||||
_LOGGER.error(
|
|
||||||
"Error parsing dates: start=%s, end=%s, error=%s",
|
|
||||||
start_date_str,
|
|
||||||
end_date_str,
|
|
||||||
e,
|
|
||||||
)
|
|
||||||
raise HTTPException(status_code=400, detail=f"Invalid date format: {e}") from e
|
|
||||||
|
|
||||||
# Extract room/guest info
|
|
||||||
num_adults = int(form_data.get("erwachsene", 2))
|
|
||||||
num_children = int(form_data.get("kinder", 0))
|
|
||||||
|
|
||||||
# Extract children ages from nested structure
|
|
||||||
children_ages = []
|
|
||||||
if num_children > 0:
|
|
||||||
alter_data = form_data.get("alter", {})
|
|
||||||
for i in range(1, num_children + 1):
|
|
||||||
age_str = alter_data.get(str(i))
|
|
||||||
if age_str:
|
|
||||||
try:
|
|
||||||
children_ages.append(int(age_str))
|
|
||||||
except ValueError:
|
|
||||||
_LOGGER.warning("Invalid age value for child %d: %s", i, age_str)
|
|
||||||
|
|
||||||
# Extract tracking information
|
|
||||||
utm_source = None
|
|
||||||
utm_medium = None
|
|
||||||
utm_campaign = None
|
|
||||||
utm_term = None
|
|
||||||
utm_content = None
|
|
||||||
fbclid = None
|
|
||||||
gclid = None
|
|
||||||
|
|
||||||
if tracking_data:
|
|
||||||
utm_source = tracking_data.get("utm_source")
|
|
||||||
utm_medium = tracking_data.get("utm_medium")
|
|
||||||
utm_campaign = tracking_data.get("utm_campaign")
|
|
||||||
utm_term = tracking_data.get("utm_term")
|
|
||||||
utm_content = tracking_data.get("utm_content")
|
|
||||||
fbclid = tracking_data.get("fbclid")
|
|
||||||
gclid = tracking_data.get("gclid")
|
|
||||||
|
|
||||||
# Parse submission timestamp
|
|
||||||
submission_time = data.get("timestamp")
|
|
||||||
try:
|
|
||||||
if submission_time:
|
|
||||||
# Handle ISO8601 format with timezone
|
|
||||||
if submission_time.endswith("Z"):
|
|
||||||
submission_time = datetime.fromisoformat(submission_time[:-1])
|
|
||||||
elif "+" in submission_time:
|
|
||||||
# Remove timezone info (e.g., +02:00)
|
|
||||||
submission_time = datetime.fromisoformat(submission_time.split("+")[0])
|
|
||||||
else:
|
|
||||||
submission_time = datetime.fromisoformat(submission_time)
|
|
||||||
except Exception as e:
|
|
||||||
_LOGGER.exception("Error parsing submission timestamp: %s", e)
|
|
||||||
submission_time = None
|
|
||||||
|
|
||||||
# Generate unique ID
|
|
||||||
unique_id = generate_unique_id()
|
|
||||||
|
|
||||||
# Use CustomerService to handle customer creation/update with hashing
|
|
||||||
customer_service = CustomerService(db)
|
|
||||||
|
|
||||||
customer_data = {
|
|
||||||
"given_name": first_name,
|
|
||||||
"surname": last_name,
|
|
||||||
"contact_id": None,
|
|
||||||
"name_prefix": name_prefix if name_prefix != "--" else None,
|
|
||||||
"email_address": email,
|
|
||||||
"phone": phone_number if phone_number else None,
|
|
||||||
"email_newsletter": False,
|
|
||||||
"address_line": None,
|
|
||||||
"city_name": city if city else None,
|
|
||||||
"postal_code": plz if plz else None,
|
|
||||||
"country_code": country if country else None,
|
|
||||||
"gender": None,
|
|
||||||
"birth_date": None,
|
|
||||||
"language": language,
|
|
||||||
"address_catalog": False,
|
|
||||||
"name_title": None,
|
|
||||||
}
|
|
||||||
|
|
||||||
# Create/update customer
|
|
||||||
db_customer = await customer_service.get_or_create_customer(customer_data)
|
|
||||||
|
|
||||||
# Get advertising account IDs conditionally based on fbclid/gclid presence
|
|
||||||
meta_account_id, google_account_id = get_advertising_account_ids(
|
|
||||||
request.app.state.config, hotel_code, fbclid, gclid
|
|
||||||
)
|
|
||||||
|
|
||||||
# Create reservation
|
|
||||||
reservation_kwargs = {
|
|
||||||
"unique_id": unique_id,
|
|
||||||
"start_date": start_date,
|
|
||||||
"end_date": end_date,
|
|
||||||
"num_adults": num_adults,
|
|
||||||
"num_children": num_children,
|
|
||||||
"children_ages": children_ages,
|
|
||||||
"hotel_code": hotel_code,
|
|
||||||
"hotel_name": hotel_name,
|
|
||||||
"offer": selected_offers_str,
|
|
||||||
"utm_source": utm_source,
|
|
||||||
"utm_medium": utm_medium,
|
|
||||||
"utm_campaign": utm_campaign,
|
|
||||||
"utm_term": utm_term,
|
|
||||||
"utm_content": utm_content,
|
|
||||||
"user_comment": user_comment,
|
|
||||||
"fbclid": fbclid,
|
|
||||||
"gclid": gclid,
|
|
||||||
"meta_account_id": meta_account_id,
|
|
||||||
"google_account_id": google_account_id,
|
|
||||||
}
|
|
||||||
|
|
||||||
# Only include created_at if we have a valid submission_time
|
|
||||||
if submission_time:
|
|
||||||
reservation_kwargs["created_at"] = submission_time
|
|
||||||
|
|
||||||
reservation = ReservationData(**reservation_kwargs)
|
|
||||||
|
|
||||||
if reservation.md5_unique_id is None:
|
|
||||||
raise HTTPException(status_code=400, detail="Failed to generate md5_unique_id")
|
|
||||||
|
|
||||||
# Use ReservationService to create reservation
|
|
||||||
reservation_service = ReservationService(db)
|
|
||||||
db_reservation = await reservation_service.create_reservation(
|
|
||||||
reservation, db_customer.id
|
|
||||||
)
|
|
||||||
|
|
||||||
async def push_event():
|
|
||||||
# Fire event for listeners (push, etc.) - hotel-specific dispatch
|
|
||||||
dispatcher = getattr(request.app.state, "event_dispatcher", None)
|
|
||||||
if dispatcher:
|
|
||||||
# Get hotel_code from reservation to target the right listeners
|
|
||||||
hotel_code = getattr(db_reservation, "hotel_code", None)
|
|
||||||
if hotel_code and hotel_code.strip():
|
|
||||||
await dispatcher.dispatch_for_hotel(
|
|
||||||
"form_processed", hotel_code, db_customer, db_reservation
|
|
||||||
)
|
|
||||||
_LOGGER.info("Dispatched form_processed event for hotel %s", hotel_code)
|
|
||||||
else:
|
|
||||||
_LOGGER.warning(
|
|
||||||
"No hotel_code in reservation, skipping push notifications"
|
|
||||||
)
|
|
||||||
|
|
||||||
# Create task and store reference to prevent garbage collection
|
|
||||||
task = asyncio.create_task(push_event())
|
|
||||||
# Add done callback to log any exceptions
|
|
||||||
task.add_done_callback(lambda t: t.exception() if not t.cancelled() else None)
|
|
||||||
|
|
||||||
_LOGGER.info(
|
|
||||||
"Successfully processed generic webhook: customer_id=%s, reservation_id=%s",
|
|
||||||
db_customer.id,
|
|
||||||
db_reservation.id,
|
|
||||||
)
|
|
||||||
|
|
||||||
return {
|
|
||||||
"status": "success",
|
|
||||||
"message": "Generic webhook data processed successfully",
|
|
||||||
"customer_id": db_customer.id,
|
|
||||||
"reservation_id": db_reservation.id,
|
|
||||||
"timestamp": timestamp,
|
|
||||||
}
|
|
||||||
|
|
||||||
|
|
||||||
async def validate_basic_auth(
|
async def validate_basic_auth(
|
||||||
credentials: HTTPBasicCredentials = Depends(security_basic),
|
credentials: HTTPBasicCredentials = Depends(security_basic),
|
||||||
) -> str:
|
) -> str:
|
||||||
@@ -1058,6 +941,171 @@ async def validate_basic_auth(
|
|||||||
return credentials.username, credentials.password
|
return credentials.username, credentials.password
|
||||||
|
|
||||||
|
|
||||||
|
@api_router.post("/webhook/{webhook_secret}")
|
||||||
|
@webhook_limiter.limit(WEBHOOK_RATE_LIMIT)
|
||||||
|
async def handle_webhook_unified(
|
||||||
|
request: Request,
|
||||||
|
webhook_secret: str,
|
||||||
|
db_session=Depends(get_async_session),
|
||||||
|
):
|
||||||
|
"""Unified webhook handler with deduplication and routing.
|
||||||
|
|
||||||
|
Flow:
|
||||||
|
1. Look up webhook_endpoint by webhook_secret
|
||||||
|
2. Parse and hash payload (SHA256)
|
||||||
|
3. Check for duplicate using SELECT FOR UPDATE SKIP LOCKED
|
||||||
|
4. If duplicate and completed: return success (idempotent)
|
||||||
|
5. If duplicate and processing: return success (concurrent request)
|
||||||
|
6. Create or update webhook_request with status='processing'
|
||||||
|
7. Route to appropriate processor based on webhook_endpoint.webhook_type
|
||||||
|
8. Update status to 'completed' or 'failed'
|
||||||
|
9. Return response
|
||||||
|
"""
|
||||||
|
timestamp = datetime.now(UTC)
|
||||||
|
|
||||||
|
# 1. Look up webhook_endpoint
|
||||||
|
result = await db_session.execute(
|
||||||
|
select(WebhookEndpoint)
|
||||||
|
.where(
|
||||||
|
and_(
|
||||||
|
WebhookEndpoint.webhook_secret == webhook_secret,
|
||||||
|
WebhookEndpoint.is_enabled == True
|
||||||
|
)
|
||||||
|
)
|
||||||
|
.options(selectinload(WebhookEndpoint.hotel))
|
||||||
|
)
|
||||||
|
webhook_endpoint = result.scalar_one_or_none()
|
||||||
|
|
||||||
|
if not webhook_endpoint or not webhook_endpoint.hotel.is_active:
|
||||||
|
raise HTTPException(status_code=404, detail="Webhook not found")
|
||||||
|
|
||||||
|
# 2. Parse payload
|
||||||
|
body = await request.body()
|
||||||
|
|
||||||
|
# Handle gzip compression
|
||||||
|
if request.headers.get("content-encoding", "").lower() == "gzip":
|
||||||
|
try:
|
||||||
|
body = gzip.decompress(body)
|
||||||
|
except Exception as e:
|
||||||
|
_LOGGER.error("Failed to decompress gzip payload: %s", e)
|
||||||
|
raise HTTPException(status_code=400, detail="Invalid gzip compression")
|
||||||
|
|
||||||
|
try:
|
||||||
|
payload = json.loads(body.decode("utf-8"))
|
||||||
|
except Exception as e:
|
||||||
|
_LOGGER.error("Failed to parse JSON payload: %s", e)
|
||||||
|
raise HTTPException(status_code=400, detail="Invalid JSON payload")
|
||||||
|
|
||||||
|
# 3. Hash payload (canonical JSON for consistent hashing)
|
||||||
|
payload_json_str = json.dumps(payload, sort_keys=True)
|
||||||
|
payload_hash = hashlib.sha256(payload_json_str.encode("utf-8")).hexdigest()
|
||||||
|
payload_size = len(payload_json_str.encode("utf-8"))
|
||||||
|
|
||||||
|
# Check payload size limit (10MB)
|
||||||
|
if payload_size > 10 * 1024 * 1024:
|
||||||
|
_LOGGER.error("Payload too large: %d bytes", payload_size)
|
||||||
|
raise HTTPException(status_code=413, detail="Payload too large (max 10MB)")
|
||||||
|
|
||||||
|
# 4. Check for duplicate with row-level locking
|
||||||
|
duplicate = await db_session.execute(
|
||||||
|
select(WebhookRequest)
|
||||||
|
.where(WebhookRequest.payload_hash == payload_hash)
|
||||||
|
.with_for_update(skip_locked=True)
|
||||||
|
)
|
||||||
|
existing = duplicate.scalar_one_or_none()
|
||||||
|
|
||||||
|
if existing:
|
||||||
|
if existing.status == 'completed':
|
||||||
|
# Already processed successfully
|
||||||
|
_LOGGER.info(
|
||||||
|
"Webhook already processed (webhook_id=%d, hotel=%s)",
|
||||||
|
existing.id,
|
||||||
|
webhook_endpoint.hotel_id
|
||||||
|
)
|
||||||
|
return {
|
||||||
|
"status": "success",
|
||||||
|
"message": "Webhook already processed",
|
||||||
|
"webhook_id": existing.id,
|
||||||
|
"duplicate": True,
|
||||||
|
}
|
||||||
|
elif existing.status == 'processing':
|
||||||
|
# Another worker is processing right now
|
||||||
|
_LOGGER.info(
|
||||||
|
"Webhook is being processed by another worker (webhook_id=%d)",
|
||||||
|
existing.id
|
||||||
|
)
|
||||||
|
return {
|
||||||
|
"status": "success",
|
||||||
|
"message": "Webhook is being processed",
|
||||||
|
"webhook_id": existing.id,
|
||||||
|
"duplicate": True,
|
||||||
|
}
|
||||||
|
elif existing.status == 'failed':
|
||||||
|
# Retry failed webhook
|
||||||
|
_LOGGER.info(
|
||||||
|
"Retrying failed webhook (webhook_id=%d, retry_count=%d)",
|
||||||
|
existing.id,
|
||||||
|
existing.retry_count
|
||||||
|
)
|
||||||
|
webhook_request = existing
|
||||||
|
webhook_request.retry_count += 1
|
||||||
|
webhook_request.status = 'processing'
|
||||||
|
webhook_request.processing_started_at = timestamp
|
||||||
|
else:
|
||||||
|
# 5. Create new webhook_request
|
||||||
|
webhook_request = WebhookRequest(
|
||||||
|
payload_hash=payload_hash,
|
||||||
|
webhook_endpoint_id=webhook_endpoint.id,
|
||||||
|
hotel_id=webhook_endpoint.hotel_id,
|
||||||
|
status='processing',
|
||||||
|
payload_json=payload,
|
||||||
|
payload_size_bytes=payload_size,
|
||||||
|
processing_started_at=timestamp,
|
||||||
|
created_at=timestamp,
|
||||||
|
source_ip=request.client.host if request.client else None,
|
||||||
|
user_agent=request.headers.get("user-agent"),
|
||||||
|
)
|
||||||
|
db_session.add(webhook_request)
|
||||||
|
await db_session.flush()
|
||||||
|
|
||||||
|
try:
|
||||||
|
# 6. Get processor for webhook_type
|
||||||
|
processor = webhook_registry.get_processor(webhook_endpoint.webhook_type)
|
||||||
|
if not processor:
|
||||||
|
raise ValueError(f"No processor for type: {webhook_endpoint.webhook_type}")
|
||||||
|
|
||||||
|
# 7. Process webhook
|
||||||
|
result = await processor.process(
|
||||||
|
payload=payload,
|
||||||
|
webhook_request=webhook_request,
|
||||||
|
hotel=webhook_endpoint.hotel,
|
||||||
|
db_session=db_session,
|
||||||
|
request=request,
|
||||||
|
)
|
||||||
|
|
||||||
|
# 8. Update status
|
||||||
|
webhook_request.status = 'completed'
|
||||||
|
webhook_request.processing_completed_at = datetime.now(UTC)
|
||||||
|
|
||||||
|
await db_session.commit()
|
||||||
|
|
||||||
|
return {
|
||||||
|
**result,
|
||||||
|
"webhook_id": webhook_request.id,
|
||||||
|
"hotel_id": webhook_endpoint.hotel_id,
|
||||||
|
}
|
||||||
|
|
||||||
|
except Exception as e:
|
||||||
|
_LOGGER.exception("Error processing webhook: %s", e)
|
||||||
|
|
||||||
|
webhook_request.status = 'failed'
|
||||||
|
webhook_request.last_error = str(e)[:2000]
|
||||||
|
webhook_request.processing_completed_at = datetime.now(UTC)
|
||||||
|
await db_session.commit()
|
||||||
|
|
||||||
|
raise HTTPException(status_code=500, detail="Error processing webhook")
|
||||||
|
|
||||||
|
|
||||||
@api_router.post("/webhook/wix-form")
|
@api_router.post("/webhook/wix-form")
|
||||||
@webhook_limiter.limit(WEBHOOK_RATE_LIMIT)
|
@webhook_limiter.limit(WEBHOOK_RATE_LIMIT)
|
||||||
async def handle_wix_form(
|
async def handle_wix_form(
|
||||||
|
|||||||
@@ -13,6 +13,7 @@ from sqlalchemy import (
|
|||||||
Double,
|
Double,
|
||||||
ForeignKey,
|
ForeignKey,
|
||||||
ForeignKeyConstraint,
|
ForeignKeyConstraint,
|
||||||
|
Index,
|
||||||
Integer,
|
Integer,
|
||||||
String,
|
String,
|
||||||
)
|
)
|
||||||
@@ -674,3 +675,114 @@ class ConversionRoom(Base):
|
|||||||
|
|
||||||
# Relationships
|
# Relationships
|
||||||
conversion = relationship("Conversion", back_populates="conversion_rooms")
|
conversion = relationship("Conversion", back_populates="conversion_rooms")
|
||||||
|
|
||||||
|
|
||||||
|
class Hotel(Base):
|
||||||
|
"""Hotel configuration (migrated from alpine_bits_auth in config.yaml)."""
|
||||||
|
|
||||||
|
__tablename__ = "hotels"
|
||||||
|
|
||||||
|
id = Column(Integer, primary_key=True)
|
||||||
|
|
||||||
|
# Core identification
|
||||||
|
hotel_id = Column(String(50), unique=True, nullable=False, index=True)
|
||||||
|
hotel_name = Column(String(200), nullable=False)
|
||||||
|
|
||||||
|
# AlpineBits authentication
|
||||||
|
username = Column(String(100), unique=True, nullable=False, index=True)
|
||||||
|
password_hash = Column(String(200), nullable=False) # bcrypt
|
||||||
|
|
||||||
|
# Advertising accounts
|
||||||
|
meta_account_id = Column(String(50), nullable=True)
|
||||||
|
google_account_id = Column(String(50), nullable=True)
|
||||||
|
|
||||||
|
# Push endpoint (optional)
|
||||||
|
push_endpoint_url = Column(String(500), nullable=True)
|
||||||
|
push_endpoint_token = Column(String(200), nullable=True)
|
||||||
|
push_endpoint_username = Column(String(100), nullable=True)
|
||||||
|
|
||||||
|
# Metadata
|
||||||
|
created_at = Column(DateTime(timezone=True), nullable=False)
|
||||||
|
updated_at = Column(DateTime(timezone=True), nullable=False)
|
||||||
|
is_active = Column(Boolean, default=True, nullable=False, index=True)
|
||||||
|
|
||||||
|
# Relationships
|
||||||
|
webhook_endpoints = relationship("WebhookEndpoint", back_populates="hotel")
|
||||||
|
|
||||||
|
|
||||||
|
class WebhookEndpoint(Base):
|
||||||
|
"""Webhook configurations per hotel (supports multiple webhook types per hotel)."""
|
||||||
|
|
||||||
|
__tablename__ = "webhook_endpoints"
|
||||||
|
|
||||||
|
id = Column(Integer, primary_key=True)
|
||||||
|
|
||||||
|
# Hotel association
|
||||||
|
hotel_id = Column(String(50), ForeignKey("hotels.hotel_id"), nullable=False, index=True)
|
||||||
|
|
||||||
|
# Webhook configuration
|
||||||
|
webhook_secret = Column(String(64), unique=True, nullable=False, index=True)
|
||||||
|
webhook_type = Column(String(50), nullable=False) # 'wix_form', 'generic', etc.
|
||||||
|
|
||||||
|
# Metadata
|
||||||
|
description = Column(String(200), nullable=True) # Human-readable label
|
||||||
|
is_enabled = Column(Boolean, default=True, nullable=False)
|
||||||
|
created_at = Column(DateTime(timezone=True), nullable=False)
|
||||||
|
|
||||||
|
# Relationships
|
||||||
|
hotel = relationship("Hotel", back_populates="webhook_endpoints")
|
||||||
|
webhook_requests = relationship("WebhookRequest", back_populates="webhook_endpoint")
|
||||||
|
|
||||||
|
__table_args__ = (
|
||||||
|
Index('idx_webhook_endpoint_hotel_type', 'hotel_id', 'webhook_type'),
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
class WebhookRequest(Base):
|
||||||
|
"""Tracks incoming webhooks for deduplication and retry handling."""
|
||||||
|
|
||||||
|
__tablename__ = "webhook_requests"
|
||||||
|
|
||||||
|
id = Column(Integer, primary_key=True)
|
||||||
|
|
||||||
|
# Request identification
|
||||||
|
payload_hash = Column(String(64), unique=True, nullable=False, index=True) # SHA256
|
||||||
|
webhook_endpoint_id = Column(Integer, ForeignKey("webhook_endpoints.id"), nullable=True, index=True)
|
||||||
|
hotel_id = Column(String(50), ForeignKey("hotels.hotel_id"), nullable=True, index=True)
|
||||||
|
|
||||||
|
# Processing tracking
|
||||||
|
status = Column(String(20), nullable=False, default='pending', index=True)
|
||||||
|
# Status values: 'pending', 'processing', 'completed', 'failed'
|
||||||
|
|
||||||
|
processing_started_at = Column(DateTime(timezone=True), nullable=True)
|
||||||
|
processing_completed_at = Column(DateTime(timezone=True), nullable=True)
|
||||||
|
|
||||||
|
# Retry handling
|
||||||
|
retry_count = Column(Integer, default=0)
|
||||||
|
last_error = Column(String(2000), nullable=True)
|
||||||
|
|
||||||
|
# Payload storage
|
||||||
|
payload_json = Column(JSON, nullable=True) # NULL after purge, kept for retries
|
||||||
|
payload_size_bytes = Column(Integer, nullable=True) # Track original size
|
||||||
|
purged_at = Column(DateTime(timezone=True), nullable=True) # When JSON was purged
|
||||||
|
|
||||||
|
# Metadata
|
||||||
|
created_at = Column(DateTime(timezone=True), nullable=False, index=True)
|
||||||
|
source_ip = Column(String(45), nullable=True)
|
||||||
|
user_agent = Column(String(500), nullable=True)
|
||||||
|
|
||||||
|
# Result tracking
|
||||||
|
created_customer_id = Column(Integer, ForeignKey("customers.id"), nullable=True)
|
||||||
|
created_reservation_id = Column(Integer, ForeignKey("reservations.id"), nullable=True)
|
||||||
|
|
||||||
|
# Relationships
|
||||||
|
webhook_endpoint = relationship("WebhookEndpoint", back_populates="webhook_requests")
|
||||||
|
hotel = relationship("Hotel")
|
||||||
|
customer = relationship("Customer")
|
||||||
|
reservation = relationship("Reservation")
|
||||||
|
|
||||||
|
__table_args__ = (
|
||||||
|
Index('idx_webhook_status_created', 'status', 'created_at'),
|
||||||
|
Index('idx_webhook_hotel_created', 'hotel_id', 'created_at'),
|
||||||
|
Index('idx_webhook_purge_candidate', 'status', 'purged_at', 'created_at'),
|
||||||
|
)
|
||||||
|
|||||||
@@ -251,6 +251,18 @@ async def run_startup_tasks(
|
|||||||
config: Application configuration dictionary
|
config: Application configuration dictionary
|
||||||
engine: SQLAlchemy async engine (optional, for backfill tasks)
|
engine: SQLAlchemy async engine (optional, for backfill tasks)
|
||||||
"""
|
"""
|
||||||
|
# Sync config to database (hotels and webhook endpoints)
|
||||||
|
if config:
|
||||||
|
from .hotel_service import sync_config_to_database
|
||||||
|
async with sessionmaker() as session:
|
||||||
|
stats = await sync_config_to_database(session, config)
|
||||||
|
_LOGGER.info(
|
||||||
|
"Config sync: %d hotels created, %d updated, %d endpoints created",
|
||||||
|
stats["hotels_created"],
|
||||||
|
stats["hotels_updated"],
|
||||||
|
stats["endpoints_created"]
|
||||||
|
)
|
||||||
|
|
||||||
# Hash any existing customers that don't have hashed data
|
# Hash any existing customers that don't have hashed data
|
||||||
async with sessionmaker() as session:
|
async with sessionmaker() as session:
|
||||||
customer_service = CustomerService(session)
|
customer_service = CustomerService(session)
|
||||||
|
|||||||
246
src/alpine_bits_python/hotel_service.py
Normal file
246
src/alpine_bits_python/hotel_service.py
Normal file
@@ -0,0 +1,246 @@
|
|||||||
|
"""Hotel service for managing hotel configuration."""
|
||||||
|
|
||||||
|
import secrets
|
||||||
|
from datetime import UTC, datetime
|
||||||
|
from typing import Any
|
||||||
|
|
||||||
|
import bcrypt
|
||||||
|
from sqlalchemy import and_, select
|
||||||
|
from sqlalchemy.ext.asyncio import AsyncSession
|
||||||
|
from sqlalchemy.orm import joinedload
|
||||||
|
|
||||||
|
from .db import Hotel, WebhookEndpoint
|
||||||
|
from .logging_config import get_logger
|
||||||
|
|
||||||
|
_LOGGER = get_logger(__name__)
|
||||||
|
|
||||||
|
|
||||||
|
def hash_password(password: str) -> str:
|
||||||
|
"""Hash password using bcrypt.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
password: Plain text password
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
Bcrypt hashed password
|
||||||
|
"""
|
||||||
|
salt = bcrypt.gensalt(rounds=12)
|
||||||
|
return bcrypt.hashpw(password.encode('utf-8'), salt).decode('utf-8')
|
||||||
|
|
||||||
|
|
||||||
|
def verify_password(password: str, password_hash: str) -> bool:
|
||||||
|
"""Verify password against bcrypt hash.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
password: Plain text password
|
||||||
|
password_hash: Bcrypt hash to verify against
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
True if password matches, False otherwise
|
||||||
|
"""
|
||||||
|
return bcrypt.checkpw(
|
||||||
|
password.encode('utf-8'),
|
||||||
|
password_hash.encode('utf-8')
|
||||||
|
)
|
||||||
|
|
||||||
|
|
||||||
|
def generate_webhook_secret() -> str:
|
||||||
|
"""Generate cryptographically secure webhook secret.
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
64-character URL-safe random string
|
||||||
|
"""
|
||||||
|
return secrets.token_urlsafe(48) # 48 bytes = 64 URL-safe chars
|
||||||
|
|
||||||
|
|
||||||
|
async def sync_config_to_database(
|
||||||
|
db_session: AsyncSession,
|
||||||
|
config: dict[str, Any]
|
||||||
|
) -> dict[str, int]:
|
||||||
|
"""Sync alpine_bits_auth from config.yaml to database.
|
||||||
|
|
||||||
|
Creates/updates hotels and generates webhook_endpoints if missing.
|
||||||
|
Idempotent - safe to run on every startup.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
db_session: Database session
|
||||||
|
config: Application configuration dict
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
Statistics dict with counts of created/updated records
|
||||||
|
"""
|
||||||
|
stats = {"hotels_created": 0, "hotels_updated": 0, "endpoints_created": 0}
|
||||||
|
|
||||||
|
alpine_bits_auth = config.get("alpine_bits_auth", [])
|
||||||
|
if not alpine_bits_auth:
|
||||||
|
_LOGGER.info("No hotels found in alpine_bits_auth config")
|
||||||
|
return stats
|
||||||
|
|
||||||
|
for hotel_config in alpine_bits_auth:
|
||||||
|
hotel_id = hotel_config.get("hotel_id")
|
||||||
|
if not hotel_id:
|
||||||
|
_LOGGER.warning("Skipping hotel config without hotel_id: %s", hotel_config)
|
||||||
|
continue
|
||||||
|
|
||||||
|
# Check if hotel exists
|
||||||
|
result = await db_session.execute(
|
||||||
|
select(Hotel).where(Hotel.hotel_id == hotel_id)
|
||||||
|
)
|
||||||
|
hotel = result.scalar_one_or_none()
|
||||||
|
|
||||||
|
if not hotel:
|
||||||
|
# Create new hotel
|
||||||
|
password_hash = hash_password(hotel_config["password"])
|
||||||
|
|
||||||
|
hotel = Hotel(
|
||||||
|
hotel_id=hotel_id,
|
||||||
|
hotel_name=hotel_config.get("hotel_name", hotel_id),
|
||||||
|
username=hotel_config["username"],
|
||||||
|
password_hash=password_hash,
|
||||||
|
meta_account_id=hotel_config.get("meta_account"),
|
||||||
|
google_account_id=hotel_config.get("google_account"),
|
||||||
|
push_endpoint_url=hotel_config.get("push_endpoint", {}).get("url"),
|
||||||
|
push_endpoint_token=hotel_config.get("push_endpoint", {}).get("token"),
|
||||||
|
push_endpoint_username=hotel_config.get("push_endpoint", {}).get("username"),
|
||||||
|
created_at=datetime.now(UTC),
|
||||||
|
updated_at=datetime.now(UTC),
|
||||||
|
is_active=True,
|
||||||
|
)
|
||||||
|
db_session.add(hotel)
|
||||||
|
await db_session.flush()
|
||||||
|
stats["hotels_created"] += 1
|
||||||
|
_LOGGER.info("Created hotel: %s", hotel_id)
|
||||||
|
else:
|
||||||
|
# Update existing hotel (config may have changed)
|
||||||
|
# Note: We do NOT update password_hash for security reasons
|
||||||
|
hotel.hotel_name = hotel_config.get("hotel_name", hotel_id)
|
||||||
|
hotel.meta_account_id = hotel_config.get("meta_account")
|
||||||
|
hotel.google_account_id = hotel_config.get("google_account")
|
||||||
|
push_endpoint = hotel_config.get("push_endpoint", {})
|
||||||
|
hotel.push_endpoint_url = push_endpoint.get("url")
|
||||||
|
hotel.push_endpoint_token = push_endpoint.get("token")
|
||||||
|
hotel.push_endpoint_username = push_endpoint.get("username")
|
||||||
|
hotel.updated_at = datetime.now(UTC)
|
||||||
|
stats["hotels_updated"] += 1
|
||||||
|
_LOGGER.debug("Updated hotel: %s", hotel_id)
|
||||||
|
|
||||||
|
# Ensure hotel has at least default webhook endpoints
|
||||||
|
result = await db_session.execute(
|
||||||
|
select(WebhookEndpoint).where(WebhookEndpoint.hotel_id == hotel_id)
|
||||||
|
)
|
||||||
|
existing_endpoints = result.scalars().all()
|
||||||
|
|
||||||
|
if not existing_endpoints:
|
||||||
|
# Create default webhook endpoints for backward compatibility
|
||||||
|
for webhook_type in ["wix_form", "generic"]:
|
||||||
|
webhook_secret = generate_webhook_secret()
|
||||||
|
endpoint = WebhookEndpoint(
|
||||||
|
hotel_id=hotel_id,
|
||||||
|
webhook_secret=webhook_secret,
|
||||||
|
webhook_type=webhook_type,
|
||||||
|
description=f"Auto-generated {webhook_type} endpoint",
|
||||||
|
is_enabled=True,
|
||||||
|
created_at=datetime.now(UTC),
|
||||||
|
)
|
||||||
|
db_session.add(endpoint)
|
||||||
|
stats["endpoints_created"] += 1
|
||||||
|
_LOGGER.info(
|
||||||
|
"Created webhook endpoint for hotel %s, type=%s, secret=%s",
|
||||||
|
hotel_id,
|
||||||
|
webhook_type,
|
||||||
|
webhook_secret
|
||||||
|
)
|
||||||
|
|
||||||
|
await db_session.commit()
|
||||||
|
|
||||||
|
_LOGGER.info(
|
||||||
|
"Config sync complete: %d hotels created, %d updated, %d endpoints created",
|
||||||
|
stats["hotels_created"],
|
||||||
|
stats["hotels_updated"],
|
||||||
|
stats["endpoints_created"]
|
||||||
|
)
|
||||||
|
|
||||||
|
return stats
|
||||||
|
|
||||||
|
|
||||||
|
class HotelService:
|
||||||
|
"""Service for hotel configuration access.
|
||||||
|
|
||||||
|
Always reads from database (synced from config at startup).
|
||||||
|
"""
|
||||||
|
|
||||||
|
def __init__(self, db_session: AsyncSession):
|
||||||
|
"""Initialize HotelService.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
db_session: Database session
|
||||||
|
"""
|
||||||
|
self.db_session = db_session
|
||||||
|
|
||||||
|
async def get_hotel_by_id(self, hotel_id: str) -> Hotel | None:
|
||||||
|
"""Get hotel by hotel_id.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
hotel_id: Hotel identifier
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
Hotel instance or None if not found
|
||||||
|
"""
|
||||||
|
result = await self.db_session.execute(
|
||||||
|
select(Hotel)
|
||||||
|
.where(
|
||||||
|
and_(
|
||||||
|
Hotel.hotel_id == hotel_id,
|
||||||
|
Hotel.is_active == True
|
||||||
|
)
|
||||||
|
)
|
||||||
|
)
|
||||||
|
return result.scalar_one_or_none()
|
||||||
|
|
||||||
|
async def get_hotel_by_webhook_secret(
|
||||||
|
self,
|
||||||
|
webhook_secret: str
|
||||||
|
) -> tuple[Hotel, WebhookEndpoint] | tuple[None, None]:
|
||||||
|
"""Get hotel and webhook_endpoint by webhook_secret.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
webhook_secret: Webhook secret string
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
Tuple of (Hotel, WebhookEndpoint) or (None, None) if not found
|
||||||
|
"""
|
||||||
|
result = await self.db_session.execute(
|
||||||
|
select(WebhookEndpoint)
|
||||||
|
.where(
|
||||||
|
and_(
|
||||||
|
WebhookEndpoint.webhook_secret == webhook_secret,
|
||||||
|
WebhookEndpoint.is_enabled == True
|
||||||
|
)
|
||||||
|
)
|
||||||
|
.options(joinedload(WebhookEndpoint.hotel))
|
||||||
|
)
|
||||||
|
endpoint = result.scalar_one_or_none()
|
||||||
|
|
||||||
|
if endpoint and endpoint.hotel.is_active:
|
||||||
|
return endpoint.hotel, endpoint
|
||||||
|
return None, None
|
||||||
|
|
||||||
|
async def get_hotel_by_username(self, username: str) -> Hotel | None:
|
||||||
|
"""Get hotel by AlpineBits username.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
username: AlpineBits username
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
Hotel instance or None if not found
|
||||||
|
"""
|
||||||
|
result = await self.db_session.execute(
|
||||||
|
select(Hotel)
|
||||||
|
.where(
|
||||||
|
and_(
|
||||||
|
Hotel.username == username,
|
||||||
|
Hotel.is_active == True
|
||||||
|
)
|
||||||
|
)
|
||||||
|
)
|
||||||
|
return result.scalar_one_or_none()
|
||||||
433
src/alpine_bits_python/webhook_processor.py
Normal file
433
src/alpine_bits_python/webhook_processor.py
Normal file
@@ -0,0 +1,433 @@
|
|||||||
|
"""Webhook processor interface and implementations."""
|
||||||
|
|
||||||
|
from abc import ABC, abstractmethod
|
||||||
|
from datetime import datetime
|
||||||
|
from typing import Any, Protocol
|
||||||
|
|
||||||
|
from fastapi import HTTPException, Request
|
||||||
|
from sqlalchemy.ext.asyncio import AsyncSession
|
||||||
|
|
||||||
|
from alpine_bits_python.api import _LOGGER, get_advertising_account_ids
|
||||||
|
from alpine_bits_python.auth import generate_unique_id
|
||||||
|
from alpine_bits_python.customer_service import CustomerService
|
||||||
|
from alpine_bits_python.reservation_service import ReservationService
|
||||||
|
from alpine_bits_python.schemas import ReservationData
|
||||||
|
|
||||||
|
from .db import Hotel, WebhookRequest
|
||||||
|
from .logging_config import get_logger
|
||||||
|
|
||||||
|
_LOGGER = get_logger(__name__)
|
||||||
|
|
||||||
|
|
||||||
|
class WebhookProcessorProtocol(Protocol):
|
||||||
|
"""Protocol for webhook processors."""
|
||||||
|
|
||||||
|
@property
|
||||||
|
def webhook_type(self) -> str:
|
||||||
|
"""Return webhook type identifier (e.g., 'wix_form', 'generic')."""
|
||||||
|
...
|
||||||
|
|
||||||
|
async def process(
|
||||||
|
self,
|
||||||
|
payload: dict[str, Any],
|
||||||
|
webhook_request: WebhookRequest,
|
||||||
|
hotel: Hotel,
|
||||||
|
db_session: AsyncSession,
|
||||||
|
request: Request,
|
||||||
|
) -> dict[str, Any]:
|
||||||
|
"""Process webhook payload.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
payload: Parsed webhook payload
|
||||||
|
webhook_request: WebhookRequest database record
|
||||||
|
hotel: Hotel associated with this webhook
|
||||||
|
db_session: Database session
|
||||||
|
request: FastAPI Request object
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
Response dict with status, message, customer_id, reservation_id
|
||||||
|
|
||||||
|
Raises:
|
||||||
|
HTTPException on processing errors
|
||||||
|
"""
|
||||||
|
...
|
||||||
|
|
||||||
|
|
||||||
|
class WebhookProcessorRegistry:
|
||||||
|
"""Registry for webhook processors."""
|
||||||
|
|
||||||
|
def __init__(self):
|
||||||
|
"""Initialize the registry."""
|
||||||
|
self._processors: dict[str, WebhookProcessorProtocol] = {}
|
||||||
|
|
||||||
|
def register(self, processor: WebhookProcessorProtocol) -> None:
|
||||||
|
"""Register a webhook processor.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
processor: Processor instance to register
|
||||||
|
"""
|
||||||
|
self._processors[processor.webhook_type] = processor
|
||||||
|
_LOGGER.info("Registered webhook processor: %s", processor.webhook_type)
|
||||||
|
|
||||||
|
def get_processor(self, webhook_type: str) -> WebhookProcessorProtocol | None:
|
||||||
|
"""Get processor for webhook type.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
webhook_type: Type of webhook to process
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
Processor instance or None if not found
|
||||||
|
"""
|
||||||
|
return self._processors.get(webhook_type)
|
||||||
|
|
||||||
|
|
||||||
|
class WixFormProcessor:
|
||||||
|
"""Processor for Wix form webhooks."""
|
||||||
|
|
||||||
|
@property
|
||||||
|
def webhook_type(self) -> str:
|
||||||
|
"""Return webhook type identifier."""
|
||||||
|
return "wix_form"
|
||||||
|
|
||||||
|
async def process(
|
||||||
|
self,
|
||||||
|
payload: dict[str, Any],
|
||||||
|
webhook_request: WebhookRequest,
|
||||||
|
hotel: Hotel,
|
||||||
|
db_session: AsyncSession,
|
||||||
|
request: Request,
|
||||||
|
) -> dict[str, Any]:
|
||||||
|
"""Process Wix form webhook payload.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
payload: Parsed webhook payload
|
||||||
|
webhook_request: WebhookRequest database record
|
||||||
|
hotel: Hotel associated with this webhook
|
||||||
|
db_session: Database session
|
||||||
|
request: FastAPI Request object
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
Response dict with status and details
|
||||||
|
"""
|
||||||
|
# Import here to avoid circular dependency
|
||||||
|
from .api import process_wix_form_submission
|
||||||
|
|
||||||
|
# Call existing processing function
|
||||||
|
result = await process_wix_form_submission(request, payload, db_session)
|
||||||
|
|
||||||
|
# The existing function doesn't return customer/reservation IDs directly,
|
||||||
|
# but they would be in the database session. We'll need to extract them
|
||||||
|
# from the result or query after the fact. For now, return the result as-is.
|
||||||
|
return result
|
||||||
|
|
||||||
|
|
||||||
|
async def process_generic_webhook_submission(
|
||||||
|
request: Request, data: dict[str, Any], db
|
||||||
|
):
|
||||||
|
"""Process generic webhook submissions with nested structure.
|
||||||
|
|
||||||
|
Expected structure:
|
||||||
|
{
|
||||||
|
"hotel_data": {"hotelname": "...", "hotelcode": "..."},
|
||||||
|
"form_data": {
|
||||||
|
"sprache": "de/it/en",
|
||||||
|
"anreise": "DD.MM.YYYY",
|
||||||
|
"abreise": "DD.MM.YYYY",
|
||||||
|
"erwachsene": "N",
|
||||||
|
"kinder": "N",
|
||||||
|
"alter": {"1": "age1", "2": "age2", ...},
|
||||||
|
"anrede": "...",
|
||||||
|
"name": "...",
|
||||||
|
"nachname": "...",
|
||||||
|
"mail": "...",
|
||||||
|
"tel": "...",
|
||||||
|
"nachricht": "..."
|
||||||
|
},
|
||||||
|
"tracking_data": {
|
||||||
|
"utm_source": "...",
|
||||||
|
"utm_medium": "...",
|
||||||
|
"utm_campaign": "...",
|
||||||
|
"utm_content": "...",
|
||||||
|
"utm_term": "...",
|
||||||
|
"fbclid": "...",
|
||||||
|
"gclid": "..."
|
||||||
|
},
|
||||||
|
"timestamp": "ISO8601"
|
||||||
|
}
|
||||||
|
"""
|
||||||
|
timestamp = datetime.now().isoformat()
|
||||||
|
_LOGGER.info("Processing generic webhook submission at %s", timestamp)
|
||||||
|
|
||||||
|
# Extract nested data
|
||||||
|
hotel_data = data.get("hotel_data", {})
|
||||||
|
form_data = data.get("form_data", {})
|
||||||
|
tracking_data = data.get("tracking_data", {})
|
||||||
|
offer_data = data.get("unterkunftTyp", {})
|
||||||
|
|
||||||
|
selected_offers = []
|
||||||
|
|
||||||
|
if offer_data:
|
||||||
|
# grab keys and values. If value is "on" add the key not the value to a list of selected offers
|
||||||
|
|
||||||
|
offer_data: dict[str, str]
|
||||||
|
|
||||||
|
for key, value in offer_data.items():
|
||||||
|
if value == "on":
|
||||||
|
selected_offers.append(key)
|
||||||
|
|
||||||
|
selected_offers_str = ", ".join(selected_offers) if selected_offers else None
|
||||||
|
|
||||||
|
# Extract hotel information
|
||||||
|
hotel_code = hotel_data.get("hotelcode")
|
||||||
|
hotel_name = hotel_data.get("hotelname")
|
||||||
|
|
||||||
|
if not hotel_code:
|
||||||
|
_LOGGER.warning("No hotel_code provided in webhook data, using default")
|
||||||
|
hotel_code = request.app.state.config.get("default_hotel_code", "123")
|
||||||
|
|
||||||
|
if not hotel_name:
|
||||||
|
hotel_name = (
|
||||||
|
request.app.state.config.get("default_hotel_name") or "Frangart Inn"
|
||||||
|
)
|
||||||
|
|
||||||
|
# Extract customer information
|
||||||
|
first_name = form_data.get("name")
|
||||||
|
last_name = form_data.get("nachname")
|
||||||
|
email = form_data.get("mail")
|
||||||
|
phone_number = form_data.get("tel")
|
||||||
|
name_prefix = form_data.get("anrede")
|
||||||
|
language = form_data.get("sprache", "de")[:2]
|
||||||
|
user_comment = form_data.get("nachricht", "")
|
||||||
|
plz = form_data.get("plz", "")
|
||||||
|
city = form_data.get("stadt", "")
|
||||||
|
country = form_data.get("land", "")
|
||||||
|
|
||||||
|
# Parse dates - handle DD.MM.YYYY format
|
||||||
|
start_date_str = form_data.get("anreise")
|
||||||
|
end_date_str = form_data.get("abreise")
|
||||||
|
|
||||||
|
if not start_date_str or not end_date_str:
|
||||||
|
raise HTTPException(
|
||||||
|
status_code=400, detail="Missing required dates (anreise/abreise)"
|
||||||
|
)
|
||||||
|
|
||||||
|
try:
|
||||||
|
# Parse DD.MM.YYYY format using strptime
|
||||||
|
start_date = datetime.strptime(start_date_str, "%d.%m.%Y").date()
|
||||||
|
end_date = datetime.strptime(end_date_str, "%d.%m.%Y").date()
|
||||||
|
except ValueError as e:
|
||||||
|
_LOGGER.error(
|
||||||
|
"Error parsing dates: start=%s, end=%s, error=%s",
|
||||||
|
start_date_str,
|
||||||
|
end_date_str,
|
||||||
|
e,
|
||||||
|
)
|
||||||
|
raise HTTPException(status_code=400, detail=f"Invalid date format: {e}") from e
|
||||||
|
|
||||||
|
# Extract room/guest info
|
||||||
|
num_adults = int(form_data.get("erwachsene", 2))
|
||||||
|
num_children = int(form_data.get("kinder", 0))
|
||||||
|
|
||||||
|
# Extract children ages from nested structure
|
||||||
|
children_ages = []
|
||||||
|
if num_children > 0:
|
||||||
|
alter_data = form_data.get("alter", {})
|
||||||
|
for i in range(1, num_children + 1):
|
||||||
|
age_str = alter_data.get(str(i))
|
||||||
|
if age_str:
|
||||||
|
try:
|
||||||
|
children_ages.append(int(age_str))
|
||||||
|
except ValueError:
|
||||||
|
_LOGGER.warning("Invalid age value for child %d: %s", i, age_str)
|
||||||
|
|
||||||
|
# Extract tracking information
|
||||||
|
utm_source = None
|
||||||
|
utm_medium = None
|
||||||
|
utm_campaign = None
|
||||||
|
utm_term = None
|
||||||
|
utm_content = None
|
||||||
|
fbclid = None
|
||||||
|
gclid = None
|
||||||
|
|
||||||
|
if tracking_data:
|
||||||
|
utm_source = tracking_data.get("utm_source")
|
||||||
|
utm_medium = tracking_data.get("utm_medium")
|
||||||
|
utm_campaign = tracking_data.get("utm_campaign")
|
||||||
|
utm_term = tracking_data.get("utm_term")
|
||||||
|
utm_content = tracking_data.get("utm_content")
|
||||||
|
fbclid = tracking_data.get("fbclid")
|
||||||
|
gclid = tracking_data.get("gclid")
|
||||||
|
|
||||||
|
# Parse submission timestamp
|
||||||
|
submission_time = data.get("timestamp")
|
||||||
|
try:
|
||||||
|
if submission_time:
|
||||||
|
# Handle ISO8601 format with timezone
|
||||||
|
if submission_time.endswith("Z"):
|
||||||
|
submission_time = datetime.fromisoformat(submission_time[:-1])
|
||||||
|
elif "+" in submission_time:
|
||||||
|
# Remove timezone info (e.g., +02:00)
|
||||||
|
submission_time = datetime.fromisoformat(submission_time.split("+")[0])
|
||||||
|
else:
|
||||||
|
submission_time = datetime.fromisoformat(submission_time)
|
||||||
|
except Exception as e:
|
||||||
|
_LOGGER.exception("Error parsing submission timestamp: %s", e)
|
||||||
|
submission_time = None
|
||||||
|
|
||||||
|
# Generate unique ID
|
||||||
|
unique_id = generate_unique_id()
|
||||||
|
|
||||||
|
# Use CustomerService to handle customer creation/update with hashing
|
||||||
|
customer_service = CustomerService(db)
|
||||||
|
|
||||||
|
customer_data = {
|
||||||
|
"given_name": first_name,
|
||||||
|
"surname": last_name,
|
||||||
|
"contact_id": None,
|
||||||
|
"name_prefix": name_prefix if name_prefix != "--" else None,
|
||||||
|
"email_address": email,
|
||||||
|
"phone": phone_number if phone_number else None,
|
||||||
|
"email_newsletter": False,
|
||||||
|
"address_line": None,
|
||||||
|
"city_name": city if city else None,
|
||||||
|
"postal_code": plz if plz else None,
|
||||||
|
"country_code": country if country else None,
|
||||||
|
"gender": None,
|
||||||
|
"birth_date": None,
|
||||||
|
"language": language,
|
||||||
|
"address_catalog": False,
|
||||||
|
"name_title": None,
|
||||||
|
}
|
||||||
|
|
||||||
|
# Create/update customer
|
||||||
|
db_customer = await customer_service.get_or_create_customer(customer_data)
|
||||||
|
|
||||||
|
# Get advertising account IDs conditionally based on fbclid/gclid presence
|
||||||
|
meta_account_id, google_account_id = get_advertising_account_ids(
|
||||||
|
request.app.state.config, hotel_code, fbclid, gclid
|
||||||
|
)
|
||||||
|
|
||||||
|
# Create reservation
|
||||||
|
reservation_kwargs = {
|
||||||
|
"unique_id": unique_id,
|
||||||
|
"start_date": start_date,
|
||||||
|
"end_date": end_date,
|
||||||
|
"num_adults": num_adults,
|
||||||
|
"num_children": num_children,
|
||||||
|
"children_ages": children_ages,
|
||||||
|
"hotel_code": hotel_code,
|
||||||
|
"hotel_name": hotel_name,
|
||||||
|
"offer": selected_offers_str,
|
||||||
|
"utm_source": utm_source,
|
||||||
|
"utm_medium": utm_medium,
|
||||||
|
"utm_campaign": utm_campaign,
|
||||||
|
"utm_term": utm_term,
|
||||||
|
"utm_content": utm_content,
|
||||||
|
"user_comment": user_comment,
|
||||||
|
"fbclid": fbclid,
|
||||||
|
"gclid": gclid,
|
||||||
|
"meta_account_id": meta_account_id,
|
||||||
|
"google_account_id": google_account_id,
|
||||||
|
}
|
||||||
|
|
||||||
|
# Only include created_at if we have a valid submission_time
|
||||||
|
if submission_time:
|
||||||
|
reservation_kwargs["created_at"] = submission_time
|
||||||
|
|
||||||
|
reservation = ReservationData(**reservation_kwargs)
|
||||||
|
|
||||||
|
if reservation.md5_unique_id is None:
|
||||||
|
raise HTTPException(status_code=400, detail="Failed to generate md5_unique_id")
|
||||||
|
|
||||||
|
# Use ReservationService to create reservation
|
||||||
|
reservation_service = ReservationService(db)
|
||||||
|
db_reservation = await reservation_service.create_reservation(
|
||||||
|
reservation, db_customer.id
|
||||||
|
)
|
||||||
|
|
||||||
|
async def push_event():
|
||||||
|
# Fire event for listeners (push, etc.) - hotel-specific dispatch
|
||||||
|
dispatcher = getattr(request.app.state, "event_dispatcher", None)
|
||||||
|
if dispatcher:
|
||||||
|
# Get hotel_code from reservation to target the right listeners
|
||||||
|
hotel_code = getattr(db_reservation, "hotel_code", None)
|
||||||
|
if hotel_code and hotel_code.strip():
|
||||||
|
await dispatcher.dispatch_for_hotel(
|
||||||
|
"form_processed", hotel_code, db_customer, db_reservation
|
||||||
|
)
|
||||||
|
_LOGGER.info("Dispatched form_processed event for hotel %s", hotel_code)
|
||||||
|
else:
|
||||||
|
_LOGGER.warning(
|
||||||
|
"No hotel_code in reservation, skipping push notifications"
|
||||||
|
)
|
||||||
|
|
||||||
|
# Create task and store reference to prevent garbage collection
|
||||||
|
task = asyncio.create_task(push_event())
|
||||||
|
# Add done callback to log any exceptions
|
||||||
|
task.add_done_callback(lambda t: t.exception() if not t.cancelled() else None)
|
||||||
|
|
||||||
|
_LOGGER.info(
|
||||||
|
"Successfully processed generic webhook: customer_id=%s, reservation_id=%s",
|
||||||
|
db_customer.id,
|
||||||
|
db_reservation.id,
|
||||||
|
)
|
||||||
|
|
||||||
|
return {
|
||||||
|
"status": "success",
|
||||||
|
"message": "Generic webhook data processed successfully",
|
||||||
|
"customer_id": db_customer.id,
|
||||||
|
"reservation_id": db_reservation.id,
|
||||||
|
"timestamp": timestamp,
|
||||||
|
}
|
||||||
|
|
||||||
|
|
||||||
|
class GenericWebhookProcessor:
|
||||||
|
"""Processor for generic webhooks."""
|
||||||
|
|
||||||
|
@property
|
||||||
|
def webhook_type(self) -> str:
|
||||||
|
"""Return webhook type identifier."""
|
||||||
|
return "generic"
|
||||||
|
|
||||||
|
async def process(
|
||||||
|
self,
|
||||||
|
payload: dict[str, Any],
|
||||||
|
webhook_request: WebhookRequest,
|
||||||
|
hotel: Hotel,
|
||||||
|
db_session: AsyncSession,
|
||||||
|
request: Request,
|
||||||
|
) -> dict[str, Any]:
|
||||||
|
"""Process generic webhook payload.
|
||||||
|
|
||||||
|
Args:
|
||||||
|
payload: Parsed webhook payload
|
||||||
|
webhook_request: WebhookRequest database record
|
||||||
|
hotel: Hotel associated with this webhook
|
||||||
|
db_session: Database session
|
||||||
|
request: FastAPI Request object
|
||||||
|
|
||||||
|
Returns:
|
||||||
|
Response dict with status and details
|
||||||
|
"""
|
||||||
|
|
||||||
|
|
||||||
|
# Call existing processing function
|
||||||
|
result = await process_generic_webhook_submission(request, payload, db_session)
|
||||||
|
|
||||||
|
return result
|
||||||
|
|
||||||
|
|
||||||
|
# Global registry instance
|
||||||
|
webhook_registry = WebhookProcessorRegistry()
|
||||||
|
|
||||||
|
|
||||||
|
def initialize_webhook_processors() -> None:
|
||||||
|
"""Initialize and register all webhook processors.
|
||||||
|
|
||||||
|
This should be called during application startup.
|
||||||
|
"""
|
||||||
|
# Register built-in processors
|
||||||
|
webhook_registry.register(WixFormProcessor())
|
||||||
|
webhook_registry.register(GenericWebhookProcessor())
|
||||||
|
|
||||||
|
_LOGGER.info("Webhook processors initialized")
|
||||||
68
uv.lock
generated
68
uv.lock
generated
@@ -37,6 +37,7 @@ dependencies = [
|
|||||||
{ name = "alembic" },
|
{ name = "alembic" },
|
||||||
{ name = "annotatedyaml" },
|
{ name = "annotatedyaml" },
|
||||||
{ name = "asyncpg" },
|
{ name = "asyncpg" },
|
||||||
|
{ name = "bcrypt" },
|
||||||
{ name = "dotenv" },
|
{ name = "dotenv" },
|
||||||
{ name = "fast-langdetect" },
|
{ name = "fast-langdetect" },
|
||||||
{ name = "fastapi" },
|
{ name = "fastapi" },
|
||||||
@@ -69,6 +70,7 @@ requires-dist = [
|
|||||||
{ name = "alembic", specifier = ">=1.17.2" },
|
{ name = "alembic", specifier = ">=1.17.2" },
|
||||||
{ name = "annotatedyaml", specifier = ">=1.0.0" },
|
{ name = "annotatedyaml", specifier = ">=1.0.0" },
|
||||||
{ name = "asyncpg", specifier = ">=0.30.0" },
|
{ name = "asyncpg", specifier = ">=0.30.0" },
|
||||||
|
{ name = "bcrypt", specifier = ">=5.0.0" },
|
||||||
{ name = "dotenv", specifier = ">=0.9.9" },
|
{ name = "dotenv", specifier = ">=0.9.9" },
|
||||||
{ name = "fast-langdetect", specifier = ">=1.0.0" },
|
{ name = "fast-langdetect", specifier = ">=1.0.0" },
|
||||||
{ name = "fastapi", specifier = ">=0.117.1" },
|
{ name = "fastapi", specifier = ">=0.117.1" },
|
||||||
@@ -171,6 +173,72 @@ wheels = [
|
|||||||
{ url = "https://files.pythonhosted.org/packages/c8/a4/cec76b3389c4c5ff66301cd100fe88c318563ec8a520e0b2e792b5b84972/asyncpg-0.30.0-cp313-cp313-win_amd64.whl", hash = "sha256:f59b430b8e27557c3fb9869222559f7417ced18688375825f8f12302c34e915e", size = 621623, upload-time = "2024-10-20T00:30:09.024Z" },
|
{ url = "https://files.pythonhosted.org/packages/c8/a4/cec76b3389c4c5ff66301cd100fe88c318563ec8a520e0b2e792b5b84972/asyncpg-0.30.0-cp313-cp313-win_amd64.whl", hash = "sha256:f59b430b8e27557c3fb9869222559f7417ced18688375825f8f12302c34e915e", size = 621623, upload-time = "2024-10-20T00:30:09.024Z" },
|
||||||
]
|
]
|
||||||
|
|
||||||
|
[[package]]
|
||||||
|
name = "bcrypt"
|
||||||
|
version = "5.0.0"
|
||||||
|
source = { registry = "https://pypi.org/simple" }
|
||||||
|
sdist = { url = "https://files.pythonhosted.org/packages/d4/36/3329e2518d70ad8e2e5817d5a4cac6bba05a47767ec416c7d020a965f408/bcrypt-5.0.0.tar.gz", hash = "sha256:f748f7c2d6fd375cc93d3fba7ef4a9e3a092421b8dbf34d8d4dc06be9492dfdd", size = 25386, upload-time = "2025-09-25T19:50:47.829Z" }
|
||||||
|
wheels = [
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/13/85/3e65e01985fddf25b64ca67275bb5bdb4040bd1a53b66d355c6c37c8a680/bcrypt-5.0.0-cp313-cp313t-macosx_10_12_universal2.whl", hash = "sha256:f3c08197f3039bec79cee59a606d62b96b16669cff3949f21e74796b6e3cd2be", size = 481806, upload-time = "2025-09-25T19:49:05.102Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/44/dc/01eb79f12b177017a726cbf78330eb0eb442fae0e7b3dfd84ea2849552f3/bcrypt-5.0.0-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:200af71bc25f22006f4069060c88ed36f8aa4ff7f53e67ff04d2ab3f1e79a5b2", size = 268626, upload-time = "2025-09-25T19:49:06.723Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/8c/cf/e82388ad5959c40d6afd94fb4743cc077129d45b952d46bdc3180310e2df/bcrypt-5.0.0-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:baade0a5657654c2984468efb7d6c110db87ea63ef5a4b54732e7e337253e44f", size = 271853, upload-time = "2025-09-25T19:49:08.028Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/ec/86/7134b9dae7cf0efa85671651341f6afa695857fae172615e960fb6a466fa/bcrypt-5.0.0-cp313-cp313t-manylinux_2_28_aarch64.whl", hash = "sha256:c58b56cdfb03202b3bcc9fd8daee8e8e9b6d7e3163aa97c631dfcfcc24d36c86", size = 269793, upload-time = "2025-09-25T19:49:09.727Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/cc/82/6296688ac1b9e503d034e7d0614d56e80c5d1a08402ff856a4549cb59207/bcrypt-5.0.0-cp313-cp313t-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:4bfd2a34de661f34d0bda43c3e4e79df586e4716ef401fe31ea39d69d581ef23", size = 289930, upload-time = "2025-09-25T19:49:11.204Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/d1/18/884a44aa47f2a3b88dd09bc05a1e40b57878ecd111d17e5bba6f09f8bb77/bcrypt-5.0.0-cp313-cp313t-manylinux_2_28_x86_64.whl", hash = "sha256:ed2e1365e31fc73f1825fa830f1c8f8917ca1b3ca6185773b349c20fd606cec2", size = 272194, upload-time = "2025-09-25T19:49:12.524Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/0e/8f/371a3ab33c6982070b674f1788e05b656cfbf5685894acbfef0c65483a59/bcrypt-5.0.0-cp313-cp313t-manylinux_2_34_aarch64.whl", hash = "sha256:83e787d7a84dbbfba6f250dd7a5efd689e935f03dd83b0f919d39349e1f23f83", size = 269381, upload-time = "2025-09-25T19:49:14.308Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/b1/34/7e4e6abb7a8778db6422e88b1f06eb07c47682313997ee8a8f9352e5a6f1/bcrypt-5.0.0-cp313-cp313t-manylinux_2_34_x86_64.whl", hash = "sha256:137c5156524328a24b9fac1cb5db0ba618bc97d11970b39184c1d87dc4bf1746", size = 271750, upload-time = "2025-09-25T19:49:15.584Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/c0/1b/54f416be2499bd72123c70d98d36c6cd61a4e33d9b89562c22481c81bb30/bcrypt-5.0.0-cp313-cp313t-musllinux_1_1_aarch64.whl", hash = "sha256:38cac74101777a6a7d3b3e3cfefa57089b5ada650dce2baf0cbdd9d65db22a9e", size = 303757, upload-time = "2025-09-25T19:49:17.244Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/13/62/062c24c7bcf9d2826a1a843d0d605c65a755bc98002923d01fd61270705a/bcrypt-5.0.0-cp313-cp313t-musllinux_1_1_x86_64.whl", hash = "sha256:d8d65b564ec849643d9f7ea05c6d9f0cd7ca23bdd4ac0c2dbef1104ab504543d", size = 306740, upload-time = "2025-09-25T19:49:18.693Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/d5/c8/1fdbfc8c0f20875b6b4020f3c7dc447b8de60aa0be5faaf009d24242aec9/bcrypt-5.0.0-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:741449132f64b3524e95cd30e5cd3343006ce146088f074f31ab26b94e6c75ba", size = 334197, upload-time = "2025-09-25T19:49:20.523Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/a6/c1/8b84545382d75bef226fbc6588af0f7b7d095f7cd6a670b42a86243183cd/bcrypt-5.0.0-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:212139484ab3207b1f0c00633d3be92fef3c5f0af17cad155679d03ff2ee1e41", size = 352974, upload-time = "2025-09-25T19:49:22.254Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/10/a6/ffb49d4254ed085e62e3e5dd05982b4393e32fe1e49bb1130186617c29cd/bcrypt-5.0.0-cp313-cp313t-win32.whl", hash = "sha256:9d52ed507c2488eddd6a95bccee4e808d3234fa78dd370e24bac65a21212b861", size = 148498, upload-time = "2025-09-25T19:49:24.134Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/48/a9/259559edc85258b6d5fc5471a62a3299a6aa37a6611a169756bf4689323c/bcrypt-5.0.0-cp313-cp313t-win_amd64.whl", hash = "sha256:f6984a24db30548fd39a44360532898c33528b74aedf81c26cf29c51ee47057e", size = 145853, upload-time = "2025-09-25T19:49:25.702Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/2d/df/9714173403c7e8b245acf8e4be8876aac64a209d1b392af457c79e60492e/bcrypt-5.0.0-cp313-cp313t-win_arm64.whl", hash = "sha256:9fffdb387abe6aa775af36ef16f55e318dcda4194ddbf82007a6f21da29de8f5", size = 139626, upload-time = "2025-09-25T19:49:26.928Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/f8/14/c18006f91816606a4abe294ccc5d1e6f0e42304df5a33710e9e8e95416e1/bcrypt-5.0.0-cp314-cp314t-macosx_10_12_universal2.whl", hash = "sha256:4870a52610537037adb382444fefd3706d96d663ac44cbb2f37e3919dca3d7ef", size = 481862, upload-time = "2025-09-25T19:49:28.365Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/67/49/dd074d831f00e589537e07a0725cf0e220d1f0d5d8e85ad5bbff251c45aa/bcrypt-5.0.0-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:48f753100931605686f74e27a7b49238122aa761a9aefe9373265b8b7aa43ea4", size = 268544, upload-time = "2025-09-25T19:49:30.39Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/f5/91/50ccba088b8c474545b034a1424d05195d9fcbaaf802ab8bfe2be5a4e0d7/bcrypt-5.0.0-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:f70aadb7a809305226daedf75d90379c397b094755a710d7014b8b117df1ebbf", size = 271787, upload-time = "2025-09-25T19:49:32.144Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/aa/e7/d7dba133e02abcda3b52087a7eea8c0d4f64d3e593b4fffc10c31b7061f3/bcrypt-5.0.0-cp314-cp314t-manylinux_2_28_aarch64.whl", hash = "sha256:744d3c6b164caa658adcb72cb8cc9ad9b4b75c7db507ab4bc2480474a51989da", size = 269753, upload-time = "2025-09-25T19:49:33.885Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/33/fc/5b145673c4b8d01018307b5c2c1fc87a6f5a436f0ad56607aee389de8ee3/bcrypt-5.0.0-cp314-cp314t-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:a28bc05039bdf3289d757f49d616ab3efe8cf40d8e8001ccdd621cd4f98f4fc9", size = 289587, upload-time = "2025-09-25T19:49:35.144Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/27/d7/1ff22703ec6d4f90e62f1a5654b8867ef96bafb8e8102c2288333e1a6ca6/bcrypt-5.0.0-cp314-cp314t-manylinux_2_28_x86_64.whl", hash = "sha256:7f277a4b3390ab4bebe597800a90da0edae882c6196d3038a73adf446c4f969f", size = 272178, upload-time = "2025-09-25T19:49:36.793Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/c8/88/815b6d558a1e4d40ece04a2f84865b0fef233513bd85fd0e40c294272d62/bcrypt-5.0.0-cp314-cp314t-manylinux_2_34_aarch64.whl", hash = "sha256:79cfa161eda8d2ddf29acad370356b47f02387153b11d46042e93a0a95127493", size = 269295, upload-time = "2025-09-25T19:49:38.164Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/51/8c/e0db387c79ab4931fc89827d37608c31cc57b6edc08ccd2386139028dc0d/bcrypt-5.0.0-cp314-cp314t-manylinux_2_34_x86_64.whl", hash = "sha256:a5393eae5722bcef046a990b84dff02b954904c36a194f6cfc817d7dca6c6f0b", size = 271700, upload-time = "2025-09-25T19:49:39.917Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/06/83/1570edddd150f572dbe9fc00f6203a89fc7d4226821f67328a85c330f239/bcrypt-5.0.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:7f4c94dec1b5ab5d522750cb059bb9409ea8872d4494fd152b53cca99f1ddd8c", size = 334034, upload-time = "2025-09-25T19:49:41.227Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/c9/f2/ea64e51a65e56ae7a8a4ec236c2bfbdd4b23008abd50ac33fbb2d1d15424/bcrypt-5.0.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:0cae4cb350934dfd74c020525eeae0a5f79257e8a201c0c176f4b84fdbf2a4b4", size = 352766, upload-time = "2025-09-25T19:49:43.08Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/d7/d4/1a388d21ee66876f27d1a1f41287897d0c0f1712ef97d395d708ba93004c/bcrypt-5.0.0-cp314-cp314t-win32.whl", hash = "sha256:b17366316c654e1ad0306a6858e189fc835eca39f7eb2cafd6aaca8ce0c40a2e", size = 152449, upload-time = "2025-09-25T19:49:44.971Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/3f/61/3291c2243ae0229e5bca5d19f4032cecad5dfb05a2557169d3a69dc0ba91/bcrypt-5.0.0-cp314-cp314t-win_amd64.whl", hash = "sha256:92864f54fb48b4c718fc92a32825d0e42265a627f956bc0361fe869f1adc3e7d", size = 149310, upload-time = "2025-09-25T19:49:46.162Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/3e/89/4b01c52ae0c1a681d4021e5dd3e45b111a8fb47254a274fa9a378d8d834b/bcrypt-5.0.0-cp314-cp314t-win_arm64.whl", hash = "sha256:dd19cf5184a90c873009244586396a6a884d591a5323f0e8a5922560718d4993", size = 143761, upload-time = "2025-09-25T19:49:47.345Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/84/29/6237f151fbfe295fe3e074ecc6d44228faa1e842a81f6d34a02937ee1736/bcrypt-5.0.0-cp38-abi3-macosx_10_12_universal2.whl", hash = "sha256:fc746432b951e92b58317af8e0ca746efe93e66555f1b40888865ef5bf56446b", size = 494553, upload-time = "2025-09-25T19:49:49.006Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/45/b6/4c1205dde5e464ea3bd88e8742e19f899c16fa8916fb8510a851fae985b5/bcrypt-5.0.0-cp38-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:c2388ca94ffee269b6038d48747f4ce8df0ffbea43f31abfa18ac72f0218effb", size = 275009, upload-time = "2025-09-25T19:49:50.581Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/3b/71/427945e6ead72ccffe77894b2655b695ccf14ae1866cd977e185d606dd2f/bcrypt-5.0.0-cp38-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:560ddb6ec730386e7b3b26b8b4c88197aaed924430e7b74666a586ac997249ef", size = 278029, upload-time = "2025-09-25T19:49:52.533Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/17/72/c344825e3b83c5389a369c8a8e58ffe1480b8a699f46c127c34580c4666b/bcrypt-5.0.0-cp38-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:d79e5c65dcc9af213594d6f7f1fa2c98ad3fc10431e7aa53c176b441943efbdd", size = 275907, upload-time = "2025-09-25T19:49:54.709Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/0b/7e/d4e47d2df1641a36d1212e5c0514f5291e1a956a7749f1e595c07a972038/bcrypt-5.0.0-cp38-abi3-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:2b732e7d388fa22d48920baa267ba5d97cca38070b69c0e2d37087b381c681fd", size = 296500, upload-time = "2025-09-25T19:49:56.013Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/0f/c3/0ae57a68be2039287ec28bc463b82e4b8dc23f9d12c0be331f4782e19108/bcrypt-5.0.0-cp38-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:0c8e093ea2532601a6f686edbc2c6b2ec24131ff5c52f7610dd64fa4553b5464", size = 278412, upload-time = "2025-09-25T19:49:57.356Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/45/2b/77424511adb11e6a99e3a00dcc7745034bee89036ad7d7e255a7e47be7d8/bcrypt-5.0.0-cp38-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:5b1589f4839a0899c146e8892efe320c0fa096568abd9b95593efac50a87cb75", size = 275486, upload-time = "2025-09-25T19:49:59.116Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/43/0a/405c753f6158e0f3f14b00b462d8bca31296f7ecfc8fc8bc7919c0c7d73a/bcrypt-5.0.0-cp38-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:89042e61b5e808b67daf24a434d89bab164d4de1746b37a8d173b6b14f3db9ff", size = 277940, upload-time = "2025-09-25T19:50:00.869Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/62/83/b3efc285d4aadc1fa83db385ec64dcfa1707e890eb42f03b127d66ac1b7b/bcrypt-5.0.0-cp38-abi3-musllinux_1_1_aarch64.whl", hash = "sha256:e3cf5b2560c7b5a142286f69bde914494b6d8f901aaa71e453078388a50881c4", size = 310776, upload-time = "2025-09-25T19:50:02.393Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/95/7d/47ee337dacecde6d234890fe929936cb03ebc4c3a7460854bbd9c97780b8/bcrypt-5.0.0-cp38-abi3-musllinux_1_1_x86_64.whl", hash = "sha256:f632fd56fc4e61564f78b46a2269153122db34988e78b6be8b32d28507b7eaeb", size = 312922, upload-time = "2025-09-25T19:50:04.232Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/d6/3a/43d494dfb728f55f4e1cf8fd435d50c16a2d75493225b54c8d06122523c6/bcrypt-5.0.0-cp38-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:801cad5ccb6b87d1b430f183269b94c24f248dddbbc5c1f78b6ed231743e001c", size = 341367, upload-time = "2025-09-25T19:50:05.559Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/55/ab/a0727a4547e383e2e22a630e0f908113db37904f58719dc48d4622139b5c/bcrypt-5.0.0-cp38-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:3cf67a804fc66fc217e6914a5635000259fbbbb12e78a99488e4d5ba445a71eb", size = 359187, upload-time = "2025-09-25T19:50:06.916Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/1b/bb/461f352fdca663524b4643d8b09e8435b4990f17fbf4fea6bc2a90aa0cc7/bcrypt-5.0.0-cp38-abi3-win32.whl", hash = "sha256:3abeb543874b2c0524ff40c57a4e14e5d3a66ff33fb423529c88f180fd756538", size = 153752, upload-time = "2025-09-25T19:50:08.515Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/41/aa/4190e60921927b7056820291f56fc57d00d04757c8b316b2d3c0d1d6da2c/bcrypt-5.0.0-cp38-abi3-win_amd64.whl", hash = "sha256:35a77ec55b541e5e583eb3436ffbbf53b0ffa1fa16ca6782279daf95d146dcd9", size = 150881, upload-time = "2025-09-25T19:50:09.742Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/54/12/cd77221719d0b39ac0b55dbd39358db1cd1246e0282e104366ebbfb8266a/bcrypt-5.0.0-cp38-abi3-win_arm64.whl", hash = "sha256:cde08734f12c6a4e28dc6755cd11d3bdfea608d93d958fffbe95a7026ebe4980", size = 144931, upload-time = "2025-09-25T19:50:11.016Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/5d/ba/2af136406e1c3839aea9ecadc2f6be2bcd1eff255bd451dd39bcf302c47a/bcrypt-5.0.0-cp39-abi3-macosx_10_12_universal2.whl", hash = "sha256:0c418ca99fd47e9c59a301744d63328f17798b5947b0f791e9af3c1c499c2d0a", size = 495313, upload-time = "2025-09-25T19:50:12.309Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/ac/ee/2f4985dbad090ace5ad1f7dd8ff94477fe089b5fab2040bd784a3d5f187b/bcrypt-5.0.0-cp39-abi3-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:ddb4e1500f6efdd402218ffe34d040a1196c072e07929b9820f363a1fd1f4191", size = 275290, upload-time = "2025-09-25T19:50:13.673Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/e4/6e/b77ade812672d15cf50842e167eead80ac3514f3beacac8902915417f8b7/bcrypt-5.0.0-cp39-abi3-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:7aeef54b60ceddb6f30ee3db090351ecf0d40ec6e2abf41430997407a46d2254", size = 278253, upload-time = "2025-09-25T19:50:15.089Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/36/c4/ed00ed32f1040f7990dac7115f82273e3c03da1e1a1587a778d8cea496d8/bcrypt-5.0.0-cp39-abi3-manylinux_2_28_aarch64.whl", hash = "sha256:f0ce778135f60799d89c9693b9b398819d15f1921ba15fe719acb3178215a7db", size = 276084, upload-time = "2025-09-25T19:50:16.699Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/e7/c4/fa6e16145e145e87f1fa351bbd54b429354fd72145cd3d4e0c5157cf4c70/bcrypt-5.0.0-cp39-abi3-manylinux_2_28_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:a71f70ee269671460b37a449f5ff26982a6f2ba493b3eabdd687b4bf35f875ac", size = 297185, upload-time = "2025-09-25T19:50:18.525Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/24/b4/11f8a31d8b67cca3371e046db49baa7c0594d71eb40ac8121e2fc0888db0/bcrypt-5.0.0-cp39-abi3-manylinux_2_28_x86_64.whl", hash = "sha256:f8429e1c410b4073944f03bd778a9e066e7fad723564a52ff91841d278dfc822", size = 278656, upload-time = "2025-09-25T19:50:19.809Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/ac/31/79f11865f8078e192847d2cb526e3fa27c200933c982c5b2869720fa5fce/bcrypt-5.0.0-cp39-abi3-manylinux_2_34_aarch64.whl", hash = "sha256:edfcdcedd0d0f05850c52ba3127b1fce70b9f89e0fe5ff16517df7e81fa3cbb8", size = 275662, upload-time = "2025-09-25T19:50:21.567Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/d4/8d/5e43d9584b3b3591a6f9b68f755a4da879a59712981ef5ad2a0ac1379f7a/bcrypt-5.0.0-cp39-abi3-manylinux_2_34_x86_64.whl", hash = "sha256:611f0a17aa4a25a69362dcc299fda5c8a3d4f160e2abb3831041feb77393a14a", size = 278240, upload-time = "2025-09-25T19:50:23.305Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/89/48/44590e3fc158620f680a978aafe8f87a4c4320da81ed11552f0323aa9a57/bcrypt-5.0.0-cp39-abi3-musllinux_1_1_aarch64.whl", hash = "sha256:db99dca3b1fdc3db87d7c57eac0c82281242d1eabf19dcb8a6b10eb29a2e72d1", size = 311152, upload-time = "2025-09-25T19:50:24.597Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/5f/85/e4fbfc46f14f47b0d20493669a625da5827d07e8a88ee460af6cd9768b44/bcrypt-5.0.0-cp39-abi3-musllinux_1_1_x86_64.whl", hash = "sha256:5feebf85a9cefda32966d8171f5db7e3ba964b77fdfe31919622256f80f9cf42", size = 313284, upload-time = "2025-09-25T19:50:26.268Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/25/ae/479f81d3f4594456a01ea2f05b132a519eff9ab5768a70430fa1132384b1/bcrypt-5.0.0-cp39-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:3ca8a166b1140436e058298a34d88032ab62f15aae1c598580333dc21d27ef10", size = 341643, upload-time = "2025-09-25T19:50:28.02Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/df/d2/36a086dee1473b14276cd6ea7f61aef3b2648710b5d7f1c9e032c29b859f/bcrypt-5.0.0-cp39-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:61afc381250c3182d9078551e3ac3a41da14154fbff647ddf52a769f588c4172", size = 359698, upload-time = "2025-09-25T19:50:31.347Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/c0/f6/688d2cd64bfd0b14d805ddb8a565e11ca1fb0fd6817175d58b10052b6d88/bcrypt-5.0.0-cp39-abi3-win32.whl", hash = "sha256:64d7ce196203e468c457c37ec22390f1a61c85c6f0b8160fd752940ccfb3a683", size = 153725, upload-time = "2025-09-25T19:50:34.384Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/9f/b9/9d9a641194a730bda138b3dfe53f584d61c58cd5230e37566e83ec2ffa0d/bcrypt-5.0.0-cp39-abi3-win_amd64.whl", hash = "sha256:64ee8434b0da054d830fa8e89e1c8bf30061d539044a39524ff7dec90481e5c2", size = 150912, upload-time = "2025-09-25T19:50:35.69Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/27/44/d2ef5e87509158ad2187f4dd0852df80695bb1ee0cfe0a684727b01a69e0/bcrypt-5.0.0-cp39-abi3-win_arm64.whl", hash = "sha256:f2347d3534e76bf50bca5500989d6c1d05ed64b440408057a37673282c654927", size = 144953, upload-time = "2025-09-25T19:50:37.32Z" },
|
||||||
|
]
|
||||||
|
|
||||||
[[package]]
|
[[package]]
|
||||||
name = "certifi"
|
name = "certifi"
|
||||||
version = "2025.8.3"
|
version = "2025.8.3"
|
||||||
|
|||||||
Reference in New Issue
Block a user