Compare commits

65 Commits: 1.0.1...f3978381df

SHA1: f3978381df, 24067847b4, 0c37254317, 9b82be9a6e, 26c6d3ffbc, 0ba70550c9, 189e44a7ff, e161508a61, 0d13f903a0, 12072dcbc8, f9139d82d7, 0106702f41, 1f7649fffe, eb10e070b1, e7b789fcac, 90d79a71fb, 81074d839a, 76ab37f097, 12385f685b, 9f36997166, 8e6049e210, 6f377b1ea1, 7bcbe70392, b0cb4e555c, 27ed8dcd1f, bd54fc72ad, a5006b2faf, 27cf040f45, f30632df29, 75f32234e0, e479381374, 38f3686948, c43782c664, 48113f6592, 063ae3277f, 6e963cec51, c07d025873, d834ec2d4b, eef70516a9, 6ad4df6990, 90e253b950, 0753d1fc1d, 716e5066e1, 9104c60956, 76e3b53a4e, f58332221b, d9e45fed36, 361611ae1b, 0d04a546cf, a8c441ea6f, 5a0ae44a45, 3669d0ca00, f22684d592, bb900ab1ee, c16848a809, 3714226b08, 8f2565b5a9, 669cf00bbc, 99d1ed1732, 0e659072c0, 592a9d7ce7, b045c62cee, 2560f61ee8, 4b61921e7a, fed8cb5653
.gitignore (vendored, +9)

@@ -25,9 +25,18 @@ logs/*
# ignore secrets
secrets.yaml

# ignore PostgreSQL config (contains credentials)
config/postgres.yaml

# ignore db
alpinebits.db

# ignore sql
*.sql
*.csv

# test output files
test_output.txt
output.xml
.vscode/extensions.json (vendored, new file, +5)

@@ -0,0 +1,5 @@
{
  "recommendations": [
    "ms-python.python"
  ]
}
.vscode/settings.json (vendored, ±18)

@@ -30,19 +30,19 @@
   "terminal.integrated.profiles.linux": {
     "bash": {
       "path": "bash",
-      "args": ["-c", "source ${workspaceFolder}/.venv/bin/activate && exec bash"]
+      "args": [
+        "-c",
+        "source ${workspaceFolder}/.venv/bin/activate && exec bash"
+      ]
     }
   },
   "python.testing.pytestEnabled": true,
-  "python.testing.pytestArgs": [
-    "tests",
-    "-v",
-    "--tb=short"
-  ],
-  "python.testing.pytestPath": "./.venv/bin/pytest",
+  "python.testing.pytestArgs": ["tests"],
+  "python.testing.pytestPath": "${workspaceFolder}/.venv/bin/pytest",
   "python.testing.unittestEnabled": false,
-  "python.testing.autoTestDiscoverOnSaveEnabled": true,
+  "python.testing.autoTestDiscoverOnSaveEnabled": false,
   "python.testing.cwd": "${workspaceFolder}",
   "python.testing.debugPort": 5678,
   "files.exclude": {
     "**/*.egg-info": true,
     "**/htmlcov": true,
@@ -53,4 +53,4 @@
     "**/.mypy_cache": true,
     "**/.pytest_cache": true
   }
 }
@@ -1 +1,5 @@
This Python project is managed by uv. Use `uv run` to execute the app and tests.

Configuration is handled in a config.yaml file. The annotatedyaml library is used to load secrets: `!secret SOME_SECRET` in the YAML file refers to a secret definition in a secrets.yaml file.

When adding something to the config, make sure to also add it to the voluptuous schema in config. If the config changes and there is an easy way to migrate an old config file, do so. If it's an addition, don't.
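For illustration, the `!secret` mechanism can be reproduced with a plain PyYAML constructor. This is a minimal sketch of the idea, not annotatedyaml's actual API; the helper name `load_config_with_secrets` is made up.

```python
# Minimal sketch of !secret resolution (hypothetical helper, not
# annotatedyaml's real API): each !secret KEY tag is replaced with the
# value stored under KEY in secrets.yaml.
from pathlib import Path

import yaml


def load_config_with_secrets(config_path: str, secrets_path: str) -> dict:
    secrets = yaml.safe_load(Path(secrets_path).read_text()) or {}

    class SecretLoader(yaml.SafeLoader):
        pass

    def _secret(loader: yaml.SafeLoader, node: yaml.Node) -> str:
        key = loader.construct_scalar(node)
        try:
            return secrets[key]
        except KeyError:
            raise ValueError(f"Secret {key!r} not found in {secrets_path}")

    SecretLoader.add_constructor("!secret", _secret)
    return yaml.load(Path(config_path).read_text(), Loader=SecretLoader)
```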
QUICK_REFERENCE.md (new file, +108)

@@ -0,0 +1,108 @@
# Multi-Worker Quick Reference

## TL;DR

**Problem**: Using 4 workers causes duplicate emails and race conditions.

**Solution**: File-based locking ensures only ONE worker runs schedulers.

## Commands

```bash
# Development (1 worker - auto primary)
uvicorn alpine_bits_python.api:app --reload

# Production (4 workers - one becomes primary)
uvicorn alpine_bits_python.api:app --workers 4 --host 0.0.0.0 --port 8000

# Test worker coordination
uv run python test_worker_coordination.py

# Run all tests
uv run pytest tests/ -v
```

## Check Which Worker is Primary

Look for startup logs:

```
[INFO] Worker startup: pid=1001, primary=True   ← PRIMARY
[INFO] Worker startup: pid=1002, primary=False  ← SECONDARY
[INFO] Worker startup: pid=1003, primary=False  ← SECONDARY
[INFO] Worker startup: pid=1004, primary=False  ← SECONDARY
[INFO] Daily report scheduler started           ← Only on PRIMARY
```

## Lock File

**Location**: `/tmp/alpinebits_primary_worker.lock`

**Check lock status**:
```bash
# See which PID holds the lock
cat /tmp/alpinebits_primary_worker.lock
# Output: 1001

# Verify process is running
ps aux | grep 1001
```

**Clean stale lock** (if needed):
```bash
rm /tmp/alpinebits_primary_worker.lock
# Then restart application
```

## What Runs Where

| Service | Primary Worker | Secondary Workers |
|---------|---------------|-------------------|
| HTTP requests | ✓ Yes | ✓ Yes |
| Email scheduler | ✓ Yes | ✗ No |
| Error alerts | ✓ Yes | ✓ Yes (all workers can send) |
| DB migrations | ✓ Yes | ✗ No |
| Customer hashing | ✓ Yes | ✗ No |

## Troubleshooting

### All workers think they're primary
**Cause**: Lock file not accessible
**Fix**: Check permissions on `/tmp/` or change lock location

### No worker becomes primary
**Cause**: Stale lock file
**Fix**: `rm /tmp/alpinebits_primary_worker.lock` and restart

### Still getting duplicate emails
**Check**: Are you seeing duplicate **scheduled reports** or **error alerts**?
- Scheduled reports should only come from primary ✓
- Error alerts can come from any worker (by design) ✓
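To script the stale-lock check above, the PID recorded in the lock file can be probed with signal 0. This is a hedged sketch; the lock-file path matches the location above, but the helper name is made up.

```python
# Sketch: detect a stale primary-worker lock by probing the recorded PID.
# os.kill(pid, 0) sends no signal; it only checks that the process exists.
import os

LOCK_PATH = "/tmp/alpinebits_primary_worker.lock"


def lock_is_stale(lock_path: str = LOCK_PATH) -> bool:
    try:
        pid = int(open(lock_path).read().strip())
    except (FileNotFoundError, ValueError):
        return False  # no lock file, or unreadable contents - nothing to clean
    try:
        os.kill(pid, 0)  # raises if no such process
        return False  # holder still alive
    except ProcessLookupError:
        return True  # PID is gone: the lock file is stale
    except PermissionError:
        return False  # process exists but is owned by another user
```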
## Code Example

```python
from alpine_bits_python.worker_coordination import is_primary_worker

async def lifespan(app: FastAPI):
    # Acquire lock - only one worker succeeds
    is_primary, worker_lock = is_primary_worker()

    if is_primary:
        # Start singleton services
        scheduler.start()

    # All workers handle requests
    yield

    # Release lock on shutdown
    if worker_lock:
        worker_lock.release()
```

## Documentation

- **Full guide**: `docs/MULTI_WORKER_DEPLOYMENT.md`
- **Solution summary**: `SOLUTION_SUMMARY.md`
- **Implementation**: `src/alpine_bits_python/worker_coordination.py`
- **Test script**: `test_worker_coordination.py`
README.md (±75)

@@ -1,11 +1,10 @@
# Overview

Contains an AlpineBits server written in Python for transferring booking requests from landing pages to partner hotels. A FastAPI endpoint receives request forms from the wix.com landing pages and stores them in the database. The AlpineBits server then makes these available to hotels at the endpoint `www.99tales.net/api/alpinebits/server-2024-10`.

## Development

Git and the uv Python package manager must be installed on the development system.

### Git Authentication

@@ -21,7 +20,7 @@ Done via two YAML files. What needs configuring is the connection to the database

```yaml
database:
  url: "sqlite+aiosqlite:///alpinebits.db" # For local dev, use SQLite. For prod, override with PostgreSQL URL.
  # url: "postgresql://user:password@host:port/dbname" # Example for Postgres

alpine_bits_auth:
```

@@ -43,18 +42,20 @@ ALICE_PASSWORD: "supersecretpassword123"

## Deployment

The application is deployed in a Docker container. Building the container image requires the following command:

```bash
uv sync
docker build . -t gitea.linter-home.com/jonas/asa_api:master
```

This command must be executed in the root directory of the repository; `pwd` should yield something ending in alpinebits_python. The dot after the docker build command points to the local Dockerfile. "-t" stands for tag; in this example the image is tagged `gitea.linter-home.com/jonas/asa_api:master`.

This build command still refers to the automatic build pipeline on my home system. I have not yet set up such a pipeline on the 99tales.net server because it is tedious stuff.

Ideally there would be a build pipeline in Gitea itself, but setting that up is somewhat difficult, and it is quite possible that the Hetzner VM cannot handle it. I have this set up locally at home; it was anything but easy.

Best to simply build the container directly on the target system and reference that image in the Docker Compose file.

### Docker Compose Example with Traefik Reverse Proxy

@@ -64,29 +65,27 @@ services:
    image: gitea.linter-home.com/jonas/asa_api:master
    container_name: asa_connector
    restart: unless-stopped

    # Environment variables via .env file
    env_file:
      - asa_connector.env

    networks:
      - external

    # Only expose internally - Traefik will handle external access
    expose:
      - "8000"

    user: "1000:1000" # Run as user with UID 1000 and GID 1000

    environment:
-     - ALPINEBITS_CONFIG_DIR=/config
+     - ALPINE_BITS_CONFIG_DIR=/config

    volumes:
      - /home/jonas/asa_connector_logs:/app/src/logs
      - /home/jonas/alpinebits_config:/config

    # Traefik labels for automatic service discovery
    labels:
      - "traefik.enable=true"

@@ -96,12 +95,12 @@ services:
      - "traefik.http.routers.asa_connector.tls.certresolver=letsencrypt"
      - "traefik.http.services.asa_connector.loadbalancer.server.port=8000"
      - "traefik.http.routers.asa_connector.priority=100"

      # Redirect middleware for non-API paths
      - "traefik.http.middlewares.redirect-to-99tales-it.redirectregex.regex=^https://99tales\\.net/(.*)$$"
      - "traefik.http.middlewares.redirect-to-99tales-it.redirectregex.replacement=https://99tales.it/$${1}"
      - "traefik.http.middlewares.redirect-to-99tales-it.redirectregex.permanent=true"

      # Catch-all router for non-API paths on 99tales.net (lower priority)
      - "traefik.http.routers.redirect-router.rule=Host(`99tales.net`)"
      - "traefik.http.routers.redirect-router.entrypoints=https"

@@ -121,16 +120,14 @@ services:
    networks:
      - external

    volumes:
      - /var/run/docker.sock:/var/run/docker.sock

  traefik:
    image: traefik:latest
    container_name: traefik
    restart: unless-stopped

    environment:
      - DOCKER_HOST=dockerproxy

@@ -138,18 +135,22 @@
      - external

    ports:
      - "80:80"   # HTTP
      - "443:443" # HTTPS
      - "22:22"   # SSH for Gitea

    volumes:
      - /home/jonas/traefik:/etc/traefik # Traefik configuration files

    # Health check
    healthcheck:
-     test: ["CMD", "python", "-c", "import requests; requests.get('http://localhost:8000/health', timeout=5)"]
+     test:
+       [
+         "CMD",
+         "python",
+         "-c",
+         "import requests; requests.get('http://localhost:8000/health', timeout=5)",
+       ]
      interval: 30s
      timeout: 10s
      retries: 3

@@ -163,20 +164,6 @@ networks: # custom bridge network named 'external'

For all of this to work, DNS entries must point to the virtual machine it runs in. This was set up at Hostinger for 99tales.net.

As shown in the example, both a log folder and a config folder are mapped into the container. These are best created on the host before the container is created.

-The environment variable `ALPINEBITS_CONFIG_DIR` then tells the program where to find the config. The configuration files mentioned above can be stored in that folder. If SQLite is used as the database, the database will also be found there after the first run.
-
-# TODO List
-
-Need a table in the database that stores requests that have already been acknowledged by the client. Should contain client_id + a list of all acked unique_ids
+The environment variable `ALPINE_BITS_CONFIG_DIR` then tells the program where to find the config. The configuration files mentioned above can be stored in that folder. If SQLite is used as the database, the database will also be found there after the first run.
SOLUTION_SUMMARY.md (new file, +193)

@@ -0,0 +1,193 @@
# Multi-Worker Deployment Solution Summary

## Problem

When running FastAPI with `uvicorn --workers 4`, the `lifespan` function executes in **all 4 worker processes**, causing:

- ❌ **Duplicate email notifications** (4x emails sent)
- ❌ **Multiple schedulers** running simultaneously
- ❌ **Race conditions** in database operations

## Root Cause

Your original implementation tried to detect the primary worker using:

```python
multiprocessing.current_process().name == "MainProcess"
```

**This doesn't work** because with `uvicorn --workers N`, each worker is a separate process with its own name, and none are reliably named "MainProcess".

## Solution Implemented

### File-Based Worker Locking

We implemented a **file-based locking mechanism** that ensures only ONE worker runs singleton services:

```python
# worker_coordination.py
class WorkerLock:
    """Uses fcntl.flock() to coordinate workers across processes"""

    def acquire(self) -> bool:
        """Try to acquire exclusive lock - only one process succeeds"""
        fcntl.flock(self.lock_fd.fileno(), fcntl.LOCK_EX | fcntl.LOCK_NB)
```

### Updated Lifespan Function

```python
async def lifespan(app: FastAPI):
    # File-based lock ensures only one worker is primary
    is_primary, worker_lock = is_primary_worker()

    if is_primary:
        # ✓ Start email scheduler (ONCE)
        # ✓ Run database migrations (ONCE)
        # ✓ Start background tasks (ONCE)
        ...
    else:
        # Skip singleton services
        pass

    # All workers handle HTTP requests normally
    yield

    # Release lock on shutdown
    if worker_lock:
        worker_lock.release()
```

## How It Works

```
uvicorn --workers 4
        │
        ├─ Worker 0 → tries lock → ✓ SUCCESS → PRIMARY (runs schedulers)
        ├─ Worker 1 → tries lock → ✗ BUSY → SECONDARY (handles requests)
        ├─ Worker 2 → tries lock → ✗ BUSY → SECONDARY (handles requests)
        └─ Worker 3 → tries lock → ✗ BUSY → SECONDARY (handles requests)
```
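For reference, a self-contained version of this pattern looks roughly like the following. It is a sketch of the technique, not the repository's exact `worker_coordination.py`; attribute names like `lock_path` are assumptions.

```python
# Sketch of file-based primary-worker election with fcntl (Unix only).
# Not the repository's exact implementation; names are illustrative.
import fcntl
import os


class WorkerLock:
    """Exclusive, non-blocking file lock; at most one process holds it."""

    def __init__(self, lock_path: str = "/tmp/alpinebits_primary_worker.lock"):
        self.lock_path = lock_path
        self.lock_fd = None

    def acquire(self) -> bool:
        # Append mode avoids truncating the holder's PID if we lose the race.
        self.lock_fd = open(self.lock_path, "a")
        try:
            # LOCK_NB makes this non-blocking: losers get BlockingIOError.
            fcntl.flock(self.lock_fd.fileno(), fcntl.LOCK_EX | fcntl.LOCK_NB)
        except BlockingIOError:
            self.lock_fd.close()
            self.lock_fd = None
            return False
        # Record the holder's PID so operators can inspect the lock file.
        self.lock_fd.truncate(0)
        self.lock_fd.write(str(os.getpid()))
        self.lock_fd.flush()
        return True

    def release(self) -> None:
        if self.lock_fd is not None:
            fcntl.flock(self.lock_fd.fileno(), fcntl.LOCK_UN)
            self.lock_fd.close()
            self.lock_fd = None


def is_primary_worker() -> "tuple[bool, WorkerLock | None]":
    lock = WorkerLock()
    if lock.acquire():
        return True, lock
    return False, None
```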
## Verification

### Test Results

```bash
$ uv run python test_worker_coordination.py

Worker 0 (PID 30773): ✓ I am PRIMARY
Worker 1 (PID 30774): ✗ I am SECONDARY
Worker 2 (PID 30775): ✗ I am SECONDARY
Worker 3 (PID 30776): ✗ I am SECONDARY
✓ Test complete: Only ONE worker should have been PRIMARY
```
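The test script itself is not reproduced in this diff. Under the assumption that it simply spawns several processes and lets each one try the lock, a minimal equivalent looks like this (not the repository's actual `test_worker_coordination.py`):

```python
# Sketch of a coordination test: spawn 4 processes, let each try the
# lock, and confirm exactly one reports PRIMARY.
import multiprocessing
import os
import time

from alpine_bits_python.worker_coordination import WorkerLock


def worker(index: int) -> None:
    lock = WorkerLock()
    if lock.acquire():
        print(f"Worker {index} (PID {os.getpid()}): ✓ I am PRIMARY")
    else:
        print(f"Worker {index} (PID {os.getpid()}): ✗ I am SECONDARY")
    # Stay alive briefly so every sibling tries while the primary
    # still owns the lock (the lock is released when the process exits).
    time.sleep(0.5)


if __name__ == "__main__":
    procs = [multiprocessing.Process(target=worker, args=(i,)) for i in range(4)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    print("✓ Test complete: Only ONE worker should have been PRIMARY")
```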
### All Tests Pass

```bash
$ uv run pytest tests/ -v
======================= 120 passed, 23 warnings in 1.96s =======================
```

## Files Modified

1. **`worker_coordination.py`** (NEW)
   - `WorkerLock` class with `fcntl` file locking
   - `is_primary_worker()` function for easy integration

2. **`api.py`** (MODIFIED)
   - Import `is_primary_worker` from worker_coordination
   - Replace manual worker detection with file-based locking
   - Use `is_primary` flag to conditionally start schedulers
   - Release lock on shutdown

## Advantages of This Solution

✅ **No external dependencies** - uses standard library `fcntl`
✅ **Automatic failover** - if primary crashes, lock is auto-released
✅ **Works with any ASGI server** - uvicorn, gunicorn, hypercorn
✅ **Simple and reliable** - battle-tested Unix file locking
✅ **No race conditions** - atomic lock acquisition
✅ **Production-ready** - handles edge cases gracefully

## Usage

### Development (Single Worker)
```bash
uvicorn alpine_bits_python.api:app --reload
# Single worker becomes primary automatically
```

### Production (Multiple Workers)
```bash
uvicorn alpine_bits_python.api:app --workers 4
# Worker that starts first becomes primary
# Others become secondary workers
```

### Check Logs
```
[INFO] Worker startup: process=SpawnProcess-1, pid=1001, primary=True
[INFO] Worker startup: process=SpawnProcess-2, pid=1002, primary=False
[INFO] Worker startup: process=SpawnProcess-3, pid=1003, primary=False
[INFO] Worker startup: process=SpawnProcess-4, pid=1004, primary=False
[INFO] Daily report scheduler started   # ← Only on primary!
```

## What This Fixes

| Issue | Before | After |
|-------|--------|-------|
| **Email notifications** | Sent 4x (one per worker) | Sent 1x (only primary) |
| **Daily report scheduler** | 4 schedulers running | 1 scheduler running |
| **Customer hashing** | Race condition across workers | Only primary hashes |
| **Startup logs** | Confusing worker detection | Clear primary/secondary status |

## Alternative Approaches Considered

### ❌ Environment Variables
```bash
ALPINEBITS_PRIMARY_WORKER=true uvicorn app:app
```
**Problem**: Manual configuration, no automatic failover

### ❌ Process Name Detection
```python
multiprocessing.current_process().name == "MainProcess"
```
**Problem**: Unreliable with uvicorn's worker processes

### ✅ Redis-Based Locking
```python
redis.lock.Lock(redis_client, "primary_worker")
```
**When to use**: Multi-container deployments (Docker Swarm, Kubernetes)

## Recommendations

### For Single-Host Deployments (Your Case)
✅ Use the file-based locking solution (implemented)

### For Multi-Container Deployments
Consider Redis-based locks if deploying across multiple containers/hosts:

```python
# In worker_coordination.py, add Redis option
def is_primary_worker(use_redis=False):
    if use_redis:
        return redis_based_lock()
    else:
        return file_based_lock()  # Current implementation
```
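A hedged sketch of what `redis_based_lock()` could look like with the redis-py client; the key name, TTL, and connection URL are assumptions, not part of the project:

```python
# Sketch of Redis-based primary election using redis-py. The key name,
# TTL, and helper name are assumptions, not the project's implementation.
import redis


def redis_based_lock(url: str = "redis://localhost:6379/0"):
    client = redis.Redis.from_url(url)
    # timeout auto-expires the lock if the holder dies without releasing,
    # giving failover semantics similar to flock's release-on-exit.
    lock = client.lock("alpinebits:primary_worker", timeout=30)
    if lock.acquire(blocking=False):
        return True, lock  # caller must periodically lock.extend(30)
    return False, None
```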
## Conclusion

Your FastAPI application now correctly handles multiple workers:

- ✅ Only **one worker** runs singleton services (schedulers, migrations)
- ✅ All **workers** handle HTTP requests concurrently
- ✅ No **duplicate email notifications**
- ✅ No **race conditions** in database operations
- ✅ **Automatic failover** if the primary worker crashes

**Result**: You get the performance benefits of multiple workers WITHOUT the duplicate notification problem! 🎉
@@ -14059,3 +14059,55 @@ IndexError: list index out of range
 2025-10-10 10:59:53 - alpine_bits_python.api - INFO - Hotel 39040_001 has no push_endpoint configured
 2025-10-10 10:59:53 - alpine_bits_python.api - INFO - Database tables checked/created at startup.
 2025-10-10 10:59:53 - httpx - INFO - HTTP Request: PUT http://testserver/api/hoteldata/conversions_import/test_reservation.xml "HTTP/1.1 401 Unauthorized"
+2025-10-15 08:49:50 - root - INFO - Logging to file: alpinebits.log
+2025-10-15 08:49:50 - root - INFO - Logging configured at INFO level
+2025-10-15 08:49:52 - alpine_bits_python.email_service - INFO - Email service initialized: smtp.gmail.com:587
+2025-10-15 08:49:52 - root - INFO - Logging to file: alpinebits.log
+2025-10-15 08:49:52 - root - INFO - Logging configured at INFO level
+2025-10-15 08:49:54 - alpine_bits_python.email_service - INFO - Email service initialized: smtp.gmail.com:587
+2025-10-15 08:52:37 - root - INFO - Logging to file: alpinebits.log
+2025-10-15 08:52:37 - root - INFO - Logging configured at INFO level
+2025-10-15 08:52:54 - root - INFO - Logging to file: alpinebits.log
+2025-10-15 08:52:54 - root - INFO - Logging configured at INFO level
+2025-10-15 08:52:56 - alpine_bits_python.email_service - INFO - Email service initialized: smtp.titan.email:465
+2025-10-15 08:52:56 - root - INFO - Logging to file: alpinebits.log
+2025-10-15 08:52:56 - root - INFO - Logging configured at INFO level
+2025-10-15 08:52:58 - alpine_bits_python.email_service - INFO - Email service initialized: smtp.titan.email:465
+2025-10-16 16:15:42 - root - INFO - Logging to file: alpinebits.log
+2025-10-16 16:15:42 - root - INFO - Logging configured at INFO level
+2025-10-16 16:15:42 - alpine_bits_python.email_monitoring - INFO - DailyReportScheduler initialized: send_time=08:00, recipients=[]
+2025-10-16 16:15:42 - root - INFO - Daily report scheduler configured for Pushover (primary worker)
+2025-10-16 16:15:42 - alpine_bits_python.api - INFO - Application startup initiated (primary_worker=True)
+2025-10-16 16:15:42 - alpine_bits_python.alpinebits_server - INFO - Initializing action instance for AlpineBitsActionName.OTA_HOTEL_NOTIF_REPORT
+2025-10-16 16:15:42 - alpine_bits_python.alpinebits_server - INFO - Initializing action instance for AlpineBitsActionName.OTA_PING
+2025-10-16 16:15:42 - alpine_bits_python.alpinebits_server - INFO - Initializing action instance for AlpineBitsActionName.OTA_HOTEL_RES_NOTIF_GUEST_REQUESTS
+2025-10-16 16:15:42 - alpine_bits_python.alpinebits_server - INFO - Initializing action instance for AlpineBitsActionName.OTA_READ
+2025-10-16 16:15:42 - alpine_bits_python.api - INFO - Hotel 39054_001 has no push_endpoint configured
+2025-10-16 16:15:42 - alpine_bits_python.api - INFO - Hotel 135 has no push_endpoint configured
+2025-10-16 16:15:42 - alpine_bits_python.api - INFO - Hotel 39052_001 has no push_endpoint configured
+2025-10-16 16:15:42 - alpine_bits_python.api - INFO - Hotel 39040_001 has no push_endpoint configured
+2025-10-16 16:15:42 - alpine_bits_python.migrations - INFO - Starting database migrations...
+2025-10-16 16:15:42 - alpine_bits_python.migrations - INFO - Running migration: add_room_types
+2025-10-16 16:15:42 - alpine_bits_python.migrations - INFO - Adding column reservations.room_type_code (VARCHAR)
+2025-10-16 16:15:42 - alpine_bits_python.migrations - INFO - Successfully added column reservations.room_type_code
+2025-10-16 16:15:42 - alpine_bits_python.migrations - INFO - Adding column reservations.room_classification_code (VARCHAR)
+2025-10-16 16:15:42 - alpine_bits_python.migrations - INFO - Successfully added column reservations.room_classification_code
+2025-10-16 16:15:42 - alpine_bits_python.migrations - INFO - Adding column reservations.room_type (VARCHAR)
+2025-10-16 16:15:42 - alpine_bits_python.migrations - INFO - Successfully added column reservations.room_type
+2025-10-16 16:15:42 - alpine_bits_python.migrations - INFO - Migration add_room_types: Added 3 columns
+2025-10-16 16:15:42 - alpine_bits_python.migrations - INFO - Database migrations completed successfully
+2025-10-16 16:15:42 - alpine_bits_python.api - INFO - Database tables checked/created at startup.
+2025-10-16 16:15:42 - alpine_bits_python.api - INFO - All existing customers already have hashed data
+2025-10-16 16:15:42 - alpine_bits_python.email_monitoring - INFO - ReservationStatsCollector initialized with 4 hotels
+2025-10-16 16:15:42 - alpine_bits_python.api - INFO - Stats collector initialized and hooked up to report scheduler
+2025-10-16 16:15:42 - alpine_bits_python.api - INFO - Sending test daily report on startup (last 24 hours)
+2025-10-16 16:15:42 - alpine_bits_python.email_monitoring - INFO - Collecting reservation stats from 2025-10-15 16:15:42 to 2025-10-16 16:15:42
+2025-10-16 16:15:42 - alpine_bits_python.email_monitoring - INFO - Collected stats: 9 total reservations across 1 hotels
+2025-10-16 16:15:42 - alpine_bits_python.email_service - WARNING - No recipients specified for email: AlpineBits Daily Report - 2025-10-16
+2025-10-16 16:15:42 - alpine_bits_python.api - ERROR - Failed to send test daily report via email on startup
+2025-10-16 16:15:42 - alpine_bits_python.pushover_service - INFO - Pushover notification sent successfully: AlpineBits Daily Report - 2025-10-16
+2025-10-16 16:15:42 - alpine_bits_python.api - INFO - Test daily report sent via Pushover successfully on startup
+2025-10-16 16:15:42 - alpine_bits_python.email_monitoring - INFO - Daily report scheduler started
+2025-10-16 16:15:42 - alpine_bits_python.api - INFO - Daily report scheduler started
+2025-10-16 16:15:42 - alpine_bits_python.api - INFO - Application startup complete
+2025-10-16 16:15:42 - alpine_bits_python.email_monitoring - INFO - Next daily report scheduled for 2025-10-17 08:00:00 (in 15.7 hours)

alpinebits_capi_test.db (new file, BIN)
Binary file not shown.

config/alpinebits.log (new file, +338636)
File diff suppressed because it is too large.
config/config.yaml

@@ -2,40 +2,107 @@
 # AlpineBits Python config
 # Use annotatedyaml for secrets and environment-specific overrides

 database:
-  url: "sqlite+aiosqlite:///alpinebits.db" # For local dev, use SQLite. For prod, override with PostgreSQL URL.
-  # url: "postgresql://user:password@host:port/dbname" # Example for Postgres
+  url: "postgresql+asyncpg://meta_user:meta_password@localhost:5555/meta_insights"
+  schema: "alpinebits"

 logger:
   level: "INFO" # Set to DEBUG for more verbose output
-  file: "alpinebits.log" # Log file path, or null for console only
+  file: "config/alpinebits.log" # Log file path, or null for console only

 server:
   codecontext: "ADVERTISING"
   code: 70597314
   companyname: "99tales Gmbh"
   res_id_source_context: "99tales"

 alpine_bits_auth:
   - hotel_id: "39054_001"
     hotel_name: "Bemelmans Post"
     username: "bemelman"
     password: !secret BEMELMANS_PASSWORD
     meta_account: "238334370765317" # Optional: Meta advertising account ID
     google_account: "7581209925" # Optional: Google Ads account ID

   - hotel_id: "135"
     hotel_name: "Testhotel"
     username: "sebastian"
     password: !secret BOB_PASSWORD

   - hotel_id: "39052_001"
     hotel_name: "Jagthof Kaltern"
     username: "jagthof"
     password: !secret JAGTHOF_PASSWORD
     meta_account: "948363300784757"
     google_account: "1951919786"

   - hotel_id: "39040_001"
     hotel_name: "Residence Erika"
     username: "erika"
     password: !secret ERIKA_PASSWORD
     google_account: "6604634947"

 api_tokens:
   - tLTI8wXF1OVEvUX7kdZRhSW3Qr5feBCz0mHo-kbnEp0

+# Email configuration (SMTP service config - kept for when port is unblocked)
+email:
+  # SMTP server configuration
+  smtp:
+    host: "smtp.titan.email" # Your SMTP server
+    port: 465 # Usually 587 for TLS, 465 for SSL
+    username: info@99tales.net # SMTP username
+    password: !secret EMAIL_PASSWORD # SMTP password
+    use_tls: false # Use STARTTLS
+    use_ssl: true # Use SSL/TLS from start
+
+  # Email addresses
+  from_address: "info@99tales.net" # Sender address
+  from_name: "AlpineBits Monitor" # Sender display name
+
+# Pushover configuration (push notification service config)
+pushover:
+  # Pushover API credentials (get from https://pushover.net)
+  user_key: !secret PUSHOVER_USER_KEY # Your user/group key
+  api_token: !secret PUSHOVER_API_TOKEN # Your application API token
+
+# Unified notification system - recipient-based routing
+notifications:
+  # Recipients and their preferred notification methods
+  recipients:
+    - name: "jonas"
+      methods:
+        # Uncomment email when port is unblocked
+        #- type: "email"
+        #  address: "jonas@vaius.ai"
+        - type: "pushover"
+          priority: 0 # Pushover priority: -2=lowest, -1=low, 0=normal, 1=high, 2=emergency
+
+  # Daily report configuration (applies to all recipients)
+  daily_report:
+    enabled: false # Set to true to enable daily reports
+    send_time: "08:00" # Time to send daily report (24h format, local time)
+    include_stats: true # Include reservation/customer stats
+    include_errors: true # Include error summary
+
+  # Error alert configuration (applies to all recipients)
+  error_alerts:
+    enabled: false # Set to true to enable error alerts
+    # Alert is sent immediately if threshold is reached
+    error_threshold: 5 # Send immediate alert after N errors
+    # Otherwise, alert is sent after buffer time expires
+    buffer_minutes: 15 # Wait N minutes before sending buffered errors
+    # Cooldown period to prevent alert spam
+    cooldown_minutes: 15 # Wait N min before sending another alert
+    # Error severity levels to monitor
+    log_levels:
+      - "ERROR"
+      - "CRITICAL"
config/postgres.yaml.example (new file, +16)

@@ -0,0 +1,16 @@
# PostgreSQL configuration for migration
# Copy this file to postgres.yaml and fill in your PostgreSQL credentials
# This file should NOT be committed to git (add postgres.yaml to .gitignore)

database:
  url: "postgresql+asyncpg://username:password@hostname:5432/database_name"
  # Example: "postgresql+asyncpg://alpinebits_user:your_password@localhost:5432/alpinebits"
  schema: "alpinebits" # Optional: PostgreSQL schema name (default: public)

# If using annotatedyaml secrets:
# database:
#   url: !secret POSTGRES_URL
#   schema: "alpinebits" # Optional: PostgreSQL schema name
#
# Then in secrets.yaml:
# POSTGRES_URL: "postgresql+asyncpg://username:password@hostname:5432/database_name"
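As an illustration of how such a config can be consumed, SQLAlchemy's async engine accepts the `postgresql+asyncpg` URL directly, and a non-default schema can be routed via `schema_translate_map`. This is a sketch, not the project's actual db module:

```python
# Sketch: build an async SQLAlchemy engine from the database config above.
# Illustrative only - the project's actual db setup may differ.
from sqlalchemy.ext.asyncio import create_async_engine

db_config = {
    "url": "postgresql+asyncpg://username:password@hostname:5432/database_name",
    "schema": "alpinebits",
}

engine = create_async_engine(db_config["url"]).execution_options(
    # Route tables with no explicit schema into the configured one.
    schema_translate_map={None: db_config["schema"]}
)
```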
docs/EMAIL_MONITORING.md (new file, +423)

@@ -0,0 +1,423 @@
# Email Monitoring and Alerting

This document describes the email monitoring and alerting system for the AlpineBits Python server.

## Overview

The email monitoring system provides two main features:

1. **Error Alerts**: Automatic email notifications when errors occur in the application
2. **Daily Reports**: Scheduled daily summary emails with statistics and error logs

## Architecture

### Components

- **EmailService** ([email_service.py](../src/alpine_bits_python/email_service.py)): Core SMTP email sending functionality
- **EmailAlertHandler** ([email_monitoring.py](../src/alpine_bits_python/email_monitoring.py)): Custom logging handler that captures errors and sends alerts
- **DailyReportScheduler** ([email_monitoring.py](../src/alpine_bits_python/email_monitoring.py)): Background task that sends daily reports

### How It Works

#### Error Alerts (Hybrid Approach)

The `EmailAlertHandler` uses a **hybrid threshold + time-based** approach:

1. **Immediate Alerts**: If the error threshold is reached (e.g., 5 errors), an alert email is sent immediately
2. **Buffered Alerts**: Otherwise, errors accumulate in a buffer and are sent after the buffer duration (e.g., 15 minutes)
3. **Cooldown Period**: After sending an alert, the system waits for a cooldown period before sending another alert to prevent spam

A sketch of this handler logic follows the flow diagram below.

**Flow Diagram:**
```
Error occurs
     ↓
Add to buffer
     ↓
Buffer >= threshold? ──Yes──> Send immediate alert
     ↓ No                          ↓
Wait for buffer time           Reset buffer
     ↓                             ↓
Send buffered alert            Enter cooldown
     ↓
Reset buffer
```
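A minimal sketch of the hybrid logic, assuming a handler shaped like the one described (buffer, threshold, cooldown). It is illustrative, not the module's exact code; a real handler would also flush on a timer, not only when the next error arrives.

```python
# Sketch of the hybrid threshold + time-based alert handler described
# above. Field names and the send hook are assumptions for illustration.
import logging
import time


class EmailAlertHandler(logging.Handler):
    def __init__(self, error_threshold=5, buffer_minutes=15, cooldown_minutes=15):
        super().__init__(level=logging.ERROR)
        self.error_threshold = error_threshold
        self.buffer_seconds = buffer_minutes * 60
        self.cooldown_seconds = cooldown_minutes * 60
        self.buffer: list[logging.LogRecord] = []
        self.buffer_started = 0.0
        self.last_alert = 0.0

    def emit(self, record: logging.LogRecord) -> None:
        try:
            now = time.monotonic()
            if not self.buffer:
                self.buffer_started = now
            self.buffer.append(record)
            in_cooldown = now - self.last_alert < self.cooldown_seconds
            threshold_hit = len(self.buffer) >= self.error_threshold
            buffer_expired = now - self.buffer_started >= self.buffer_seconds
            if not in_cooldown and (threshold_hit or buffer_expired):
                self._send_alert(self.buffer, immediate=threshold_hit)
                self.buffer = []
                self.last_alert = now
        except Exception:
            self.handleError(record)  # never crash the application

    def _send_alert(self, records, immediate: bool) -> None:
        ...  # hand off to the email service (placeholder)
```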
#### Daily Reports

The `DailyReportScheduler` runs as a background task that:

1. Waits until the configured send time (e.g., 8:00 AM)
2. Collects statistics from the application
3. Gathers errors that occurred during the day
4. Formats and sends an email report
5. Clears the error log
6. Schedules the next report for the following day

A sketch of the wait-until-send-time computation follows this list.
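```python
# Sketch: compute the delay until the next configured send time (step 1
# above) and loop. Illustrative; not the scheduler's exact code.
import asyncio
from datetime import datetime, timedelta


def seconds_until_send(send_time: str = "08:00") -> float:
    hour, minute = map(int, send_time.split(":"))
    now = datetime.now()
    target = now.replace(hour=hour, minute=minute, second=0, microsecond=0)
    if target <= now:
        target += timedelta(days=1)  # already past today's slot
    return (target - now).total_seconds()


async def scheduler_loop(send_report) -> None:
    while True:
        await asyncio.sleep(seconds_until_send())
        await send_report()  # collect stats, send email, clear error log
```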
## Configuration

### Email Configuration Keys

Add the following to your [config.yaml](../config/config.yaml):

```yaml
email:
  # SMTP server configuration
  smtp:
    host: "smtp.gmail.com" # Your SMTP server hostname
    port: 587 # SMTP port (587 for TLS, 465 for SSL)
    username: !secret EMAIL_USERNAME # SMTP username (use !secret for env vars)
    password: !secret EMAIL_PASSWORD # SMTP password (use !secret for env vars)
    use_tls: true # Use STARTTLS encryption
    use_ssl: false # Use SSL/TLS from start (mutually exclusive with use_tls)

  # Sender information
  from_address: "noreply@99tales.com"
  from_name: "AlpineBits Monitor"

  # Monitoring and alerting
  monitoring:
    # Daily report configuration
    daily_report:
      enabled: true # Enable/disable daily reports
      recipients:
        - "admin@99tales.com"
        - "dev@99tales.com"
      send_time: "08:00" # Time to send (24h format, local time)
      include_stats: true # Include application statistics
      include_errors: true # Include error summary

    # Error alert configuration
    error_alerts:
      enabled: true # Enable/disable error alerts
      recipients:
        - "alerts@99tales.com"
        - "oncall@99tales.com"
      error_threshold: 5 # Send immediate alert after N errors
      buffer_minutes: 15 # Wait N minutes before sending buffered errors
      cooldown_minutes: 15 # Wait N minutes before sending another alert
      log_levels: # Log levels to monitor
        - "ERROR"
        - "CRITICAL"
```

### Environment Variables

For security, store sensitive credentials in environment variables:

```bash
# Create a .env file (never commit this!)
EMAIL_USERNAME=your-smtp-username@gmail.com
EMAIL_PASSWORD=your-smtp-app-password
```

The `annotatedyaml` library automatically loads values marked with `!secret` from environment variables.

### Gmail Configuration

If using Gmail, you need to:

1. Enable 2-factor authentication on your Google account
2. Generate an "App Password" for SMTP access
3. Use the app password as `EMAIL_PASSWORD`

**Gmail Settings:**
```yaml
smtp:
  host: "smtp.gmail.com"
  port: 587
  use_tls: true
  use_ssl: false
```

### Other SMTP Providers

**SendGrid:**
```yaml
smtp:
  host: "smtp.sendgrid.net"
  port: 587
  username: "apikey"
  password: !secret SENDGRID_API_KEY
  use_tls: true
```

**AWS SES:**
```yaml
smtp:
  host: "email-smtp.us-east-1.amazonaws.com"
  port: 587
  username: !secret AWS_SES_USERNAME
  password: !secret AWS_SES_PASSWORD
  use_tls: true
```

## Usage

### Automatic Error Monitoring

Once configured, the system automatically captures all `ERROR` and `CRITICAL` log messages:

```python
from alpine_bits_python.logging_config import get_logger

_LOGGER = get_logger(__name__)

# This error will be captured and sent via email
_LOGGER.error("Database connection failed")

# This will also be captured
try:
    risky_operation()
except Exception:
    _LOGGER.exception("Operation failed")  # Includes stack trace
```

### Triggering Test Alerts

To test your email configuration, you can manually trigger errors:

```python
import logging

_LOGGER = logging.getLogger(__name__)

# Generate multiple errors to trigger immediate alert (if threshold = 5)
for i in range(5):
    _LOGGER.error(f"Test error {i + 1}")
```

### Daily Report Statistics

To include custom statistics in daily reports, set a stats collector function:

```python
async def collect_stats():
    """Collect application statistics for daily report."""
    return {
        "total_reservations": await count_reservations(),
        "new_customers": await count_new_customers(),
        "active_hotels": await count_active_hotels(),
        "api_requests": get_request_count(),
    }

# Register the collector
report_scheduler = app.state.report_scheduler
if report_scheduler:
    report_scheduler.set_stats_collector(collect_stats)
```

## Email Templates

### Error Alert Email

**Subject:** 🚨 AlpineBits Error Alert: 5 errors (threshold exceeded)

**Body:**
```
Error Alert - 2025-10-15 14:30:45
======================================================================

Alert Type: Immediate Alert
Error Count: 5
Time Range: 14:25:00 to 14:30:00
Reason: (threshold of 5 exceeded)

======================================================================

Errors:
----------------------------------------------------------------------

[2025-10-15 14:25:12] ERROR: Database connection timeout
  Module: db:245 (alpine_bits_python.db)

[2025-10-15 14:26:34] ERROR: Failed to process reservation
  Module: api:567 (alpine_bits_python.api)
  Exception:
    Traceback (most recent call last):
    ...

----------------------------------------------------------------------
Generated by AlpineBits Email Monitoring at 2025-10-15 14:30:45
```

### Daily Report Email

**Subject:** AlpineBits Daily Report - 2025-10-15

**Body (HTML):**
```
AlpineBits Daily Report
Date: 2025-10-15

Statistics
┌────────────────────────┬────────┐
│ Metric                 │ Value  │
├────────────────────────┼────────┤
│ total_reservations     │ 42     │
│ new_customers          │ 15     │
│ active_hotels          │ 4      │
│ api_requests           │ 1,234  │
└────────────────────────┴────────┘

Errors (3)
┌──────────────┬──────────┬─────────────────────────┐
│ Time         │ Level    │ Message                 │
├──────────────┼──────────┼─────────────────────────┤
│ 08:15:23     │ ERROR    │ Connection timeout      │
│ 12:45:10     │ ERROR    │ Invalid form data       │
│ 18:30:00     │ CRITICAL │ Database unavailable    │
└──────────────┴──────────┴─────────────────────────┘

Generated by AlpineBits Server
```

## Monitoring and Troubleshooting

### Check Email Configuration

```python
from alpine_bits_python.email_service import create_email_service
from alpine_bits_python.config_loader import load_config

config = load_config()
email_service = create_email_service(config)

if email_service:
    print("✓ Email service configured")
else:
    print("✗ Email service not configured")
```

### Test Email Sending

```python
import asyncio
from alpine_bits_python.email_service import EmailService, EmailConfig

async def test_email():
    config = EmailConfig({
        "smtp": {
            "host": "smtp.gmail.com",
            "port": 587,
            "username": "your-email@gmail.com",
            "password": "your-app-password",
            "use_tls": True,
        },
        "from_address": "sender@example.com",
        "from_name": "Test",
    })

    service = EmailService(config)

    result = await service.send_email(
        recipients=["recipient@example.com"],
        subject="Test Email",
        body="This is a test email from AlpineBits server.",
    )

    if result:
        print("✓ Email sent successfully")
    else:
        print("✗ Email sending failed")

asyncio.run(test_email())
```

### Common Issues

**Issue: "Authentication failed"**
- Verify SMTP username and password are correct
- For Gmail, ensure you're using an App Password, not your regular password
- Check that 2FA is enabled on Gmail

**Issue: "Connection timeout"**
- Verify SMTP host and port are correct
- Check firewall rules allow outbound SMTP connections
- Try using port 465 with SSL instead of 587 with TLS

**Issue: "No email alerts received"**
- Check that `enabled: true` is set in the config
- Verify recipient email addresses are correct
- Check application logs for email sending errors
- Ensure errors are being logged at ERROR or CRITICAL level

**Issue: "Too many emails being sent"**
- Increase `cooldown_minutes` to reduce alert frequency
- Increase `buffer_minutes` to batch more errors together
- Increase `error_threshold` to only alert on serious issues

## Performance Considerations

### SMTP is Blocking

Email sending uses the standard Python `smtplib`, which performs blocking I/O. To prevent blocking the async event loop:

- Email operations are automatically run in a thread pool executor
- This happens transparently via `loop.run_in_executor()`
- No performance impact on request handling

A sketch of this offloading pattern follows.
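```python
# Sketch of running blocking smtplib I/O off the event loop, as the
# bullets above describe. Illustrative; not email_service.py verbatim.
import asyncio
import smtplib
from email.message import EmailMessage


def _send_blocking(host: str, port: int, msg: EmailMessage) -> None:
    with smtplib.SMTP_SSL(host, port) as smtp:  # blocking socket I/O
        smtp.send_message(msg)


async def send_email_async(host: str, port: int, msg: EmailMessage) -> None:
    loop = asyncio.get_running_loop()
    # None selects the default ThreadPoolExecutor; the event loop stays free.
    await loop.run_in_executor(None, _send_blocking, host, port, msg)
```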
### Memory Usage

- Error buffer size is limited by the `buffer_minutes` duration
- Old errors are automatically cleared after sending
- Daily report error log is cleared after each report
- Typical memory usage: <1 MB for error buffering

### Error Handling

- Email sending failures are logged but never crash the application
- If SMTP is unavailable, errors are logged to console/file as normal
- The logging handler is exception-safe - it will never cause application failures

## Security Considerations

1. **Never commit credentials to git**
   - Use the `!secret` annotation in YAML
   - Store credentials in environment variables
   - Add `.env` to `.gitignore`

2. **Use TLS/SSL encryption**
   - Always set `use_tls: true` or `use_ssl: true`
   - Never send credentials in plaintext

3. **Limit email recipients**
   - Only send alerts to authorized personnel
   - Use dedicated monitoring email addresses
   - Consider using distribution lists

4. **Sensitive data in logs**
   - Be careful not to log passwords, API keys, or PII
   - Error messages in emails may contain sensitive context
   - Review log messages before enabling email alerts

## Testing

Run the test suite:

```bash
# Test email service only
uv run pytest tests/test_email_service.py -v

# Test with coverage
uv run pytest tests/test_email_service.py --cov=alpine_bits_python.email_service --cov=alpine_bits_python.email_monitoring
```

## Future Enhancements

Potential improvements for future versions:

- [ ] Support for email templates (Jinja2)
- [ ] Configurable retry logic for failed sends
- [ ] Email queuing for high-volume scenarios
- [ ] Integration with external monitoring services (PagerDuty, Slack)
- [ ] Weekly/monthly report options
- [ ] Custom alert rules based on error patterns
- [ ] Email attachments for detailed logs
- [ ] HTML email styling improvements

## References

- [Python smtplib Documentation](https://docs.python.org/3/library/smtplib.html)
- [Python logging Documentation](https://docs.python.org/3/library/logging.html)
- [Gmail SMTP Settings](https://support.google.com/mail/answer/7126229)
- [annotatedyaml Documentation](https://github.com/yourusername/annotatedyaml)
docs/EMAIL_MONITORING_IMPLEMENTATION.md (new file, +301)

@@ -0,0 +1,301 @@
# Email Monitoring Implementation Summary

## Overview

Successfully implemented a comprehensive email monitoring and alerting system for the AlpineBits Python server with proper configuration schema validation.

## Implementation Completed

### 1. Core Components ✅

- **[email_service.py](../src/alpine_bits_python/email_service.py)** - SMTP email service with TLS/SSL support
- **[email_monitoring.py](../src/alpine_bits_python/email_monitoring.py)** - Logging integration with hybrid alert strategy
- **[logging_config.py](../src/alpine_bits_python/logging_config.py)** - Integration with existing logging system
- **[api.py](../src/alpine_bits_python/api.py)** - Lifecycle management (startup/shutdown)
- **[config_loader.py](../src/alpine_bits_python/config_loader.py)** - **Schema validation for email config** ✅

### 2. Configuration Schema ✅

Added comprehensive Voluptuous schemas to `config_loader.py`:

```python
from voluptuous import Boolean, In, Optional, Range, Required, Schema

# SMTP configuration
smtp_schema = Schema({
    Required("host", default="localhost"): str,
    Required("port", default=587): Range(min=1, max=65535),
    Optional("username"): str,
    Optional("password"): str,
    Required("use_tls", default=True): Boolean(),
    Required("use_ssl", default=False): Boolean(),
})

# Error alerts configuration
error_alerts_schema = Schema({
    Required("enabled", default=False): Boolean(),
    Optional("recipients", default=[]): [str],
    Required("error_threshold", default=5): Range(min=1),
    Required("buffer_minutes", default=15): Range(min=1),
    Required("cooldown_minutes", default=15): Range(min=0),
    Required("log_levels", default=["ERROR", "CRITICAL"]): [
        In(["DEBUG", "INFO", "WARNING", "ERROR", "CRITICAL"])
    ],
})

# Daily report configuration
daily_report_schema = Schema({
    Required("enabled", default=False): Boolean(),
    Optional("recipients", default=[]): [str],
    Required("send_time", default="08:00"): str,
    Required("include_stats", default=True): Boolean(),
    Required("include_errors", default=True): Boolean(),
})
```

**Benefits:**
- ✅ Type validation (strings, integers, booleans, lists)
- ✅ Range validation (port 1-65535, positive integers)
- ✅ Enum validation (log levels must be valid)
- ✅ Default values for all optional fields
- ✅ Prevents typos and misconfigurations
- ✅ Clear error messages when config is invalid
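As a quick sanity check of the schema behavior (a sketch using the definitions above): defaults are filled in, and invalid values raise `voluptuous.Invalid`.

```python
# Assumes the smtp_schema and error_alerts_schema definitions above.
from voluptuous import Invalid

validated = smtp_schema({"host": "smtp.titan.email", "port": 465})
print(validated["use_tls"])  # True - default applied

try:
    error_alerts_schema({"enabled": True, "log_levels": ["VERBOSE"]})
except Invalid as err:
    print(f"Rejected: {err}")  # VERBOSE is not a valid log level
```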
### 3. Configuration Files ✅
|
||||
|
||||
**[config/config.yaml](../config/config.yaml)** - Email configuration (currently disabled by default):
|
||||
```yaml
|
||||
email:
|
||||
smtp:
|
||||
host: "smtp.gmail.com"
|
||||
port: 587
|
||||
username: !secret EMAIL_USERNAME
|
||||
password: !secret EMAIL_PASSWORD
|
||||
use_tls: true
|
||||
from_address: "noreply@99tales.com"
|
||||
from_name: "AlpineBits Monitor"
|
||||
monitoring:
|
||||
error_alerts:
|
||||
enabled: false # Set to true to enable
|
||||
recipients: ["alerts@99tales.com"]
|
||||
error_threshold: 5
|
||||
buffer_minutes: 15
|
||||
cooldown_minutes: 15
|
||||
daily_report:
|
||||
enabled: false # Set to true to enable
|
||||
recipients: ["admin@99tales.com"]
|
||||
send_time: "08:00"
|
||||
```
|
||||
|
||||
**[config/.env.example](../config/.env.example)** - Template for environment variables
|
||||
**[config/secrets.yaml](../config/secrets.yaml)** - Secret values (not committed to git)
|
||||
|
||||
### 4. Testing ✅
|
||||
|
||||
**[tests/test_email_service.py](../tests/test_email_service.py)** - Comprehensive test suite (17 tests, all passing)
|
||||
|
||||
Test coverage:
|
||||
- ✅ EmailConfig initialization and defaults
|
||||
- ✅ Email sending (plain text and HTML)
|
||||
- ✅ Error record creation and formatting
|
||||
- ✅ EmailAlertHandler buffering and thresholds
|
||||
- ✅ DailyReportScheduler initialization and scheduling
|
||||
- ✅ Config schema validation
|
||||
|
||||
**[examples/test_email_monitoring.py](../examples/test_email_monitoring.py)** - Interactive test script
|
||||
|
||||
### 5. Documentation ✅
|
||||
|
||||
- **[EMAIL_MONITORING.md](./EMAIL_MONITORING.md)** - Complete documentation
|
||||
- **[EMAIL_MONITORING_QUICKSTART.md](./EMAIL_MONITORING_QUICKSTART.md)** - Quick start guide
|
||||
- **[EMAIL_MONITORING_IMPLEMENTATION.md](./EMAIL_MONITORING_IMPLEMENTATION.md)** - This document
|
||||
|
||||
## Key Features
|
||||
|
||||
### Hybrid Alert Strategy
|
||||
|
||||
The system uses a smart hybrid approach that balances responsiveness with spam prevention:
|
||||
|
||||
1. **Immediate Alerts** - When error threshold is reached (e.g., 5 errors), send alert immediately
|
||||
2. **Buffered Alerts** - Otherwise, accumulate errors and send after buffer time (e.g., 15 minutes)
|
||||
3. **Cooldown Period** - After sending, wait before sending another alert to prevent spam
|
||||
|
||||
### Automatic Integration
|
||||
|
||||
- **Zero Code Changes Required** - All existing `logger.error()` calls automatically trigger email alerts
|
||||
- **Non-Blocking** - SMTP operations run in thread pool, won't block async requests
|
||||
- **Thread-Safe** - Works correctly in multi-threaded async environment
|
||||
- **Production Ready** - Proper error handling, never crashes the application
|
||||
|
||||
### Schema Validation
|
||||
|
||||
The Voluptuous schema ensures:
|
||||
- ✅ All config values are valid before the app starts
|
||||
- ✅ Clear error messages for misconfigurations
|
||||
- ✅ Sensible defaults for optional values
|
||||
- ✅ Type safety (no runtime type errors)
|
||||
- ✅ PREVENT_EXTRA prevents typos in config keys
|
||||
|
||||
## Testing Results

### Schema Validation Test

```bash
✅ Config loaded successfully
✅ Email config found
SMTP host: smtp.gmail.com
SMTP port: 587
From: noreply@99tales.com
From name: AlpineBits Monitor
Error alerts enabled: False
Error threshold: 5
Daily reports enabled: False
Send time: 08:00

✅ All schema validations passed!
```

### Email Service Initialization Test

```bash
✅ Config loaded and validated by schema
✅ Email service created successfully
SMTP: smtp.gmail.com:587
TLS: True
From: AlpineBits Monitor <noreply@99tales.com>

🎉 Email monitoring is ready to use!
```

### Unit Tests

```bash
============================= test session starts ==============================
tests/test_email_service.py::TestEmailConfig::test_email_config_initialization PASSED
tests/test_email_service.py::TestEmailConfig::test_email_config_defaults PASSED
tests/test_email_service.py::TestEmailConfig::test_email_config_tls_ssl_conflict PASSED
tests/test_email_service.py::TestEmailService::test_send_email_success PASSED
tests/test_email_service.py::TestEmailService::test_send_email_no_recipients PASSED
tests/test_email_service.py::TestEmailService::test_send_email_with_html PASSED
tests/test_email_service.py::TestEmailService::test_send_alert PASSED
tests/test_email_service.py::TestEmailService::test_send_daily_report PASSED
tests/test_email_service.py::TestErrorRecord::test_error_record_creation PASSED
tests/test_email_service.py::TestErrorRecord::test_error_record_to_dict PASSED
tests/test_email_service.py::TestErrorRecord::test_error_record_format_plain_text PASSED
tests/test_email_service.py::TestEmailAlertHandler::test_handler_initialization PASSED
tests/test_email_service.py::TestEmailAlertHandler::test_handler_emit_below_threshold PASSED
tests/test_email_service.py::TestEmailAlertHandler::test_handler_ignores_non_error_levels PASSED
tests/test_email_service.py::TestDailyReportScheduler::test_scheduler_initialization PASSED
tests/test_email_service.py::TestDailyReportScheduler::test_scheduler_log_error PASSED
tests/test_email_service.py::TestDailyReportScheduler::test_scheduler_set_stats_collector PASSED

================= 17 passed, 1 warning in 0.11s ==================
```

### Regression Tests

```bash
✅ All existing API tests still pass
✅ No breaking changes to existing functionality
```

## Usage

### To Enable Email Monitoring:

1. **Add SMTP credentials** to `config/secrets.yaml`:

   ```yaml
   EMAIL_USERNAME: your-email@gmail.com
   EMAIL_PASSWORD: your-app-password
   ```

2. **Enable features** in `config/config.yaml`:

   ```yaml
   email:
     monitoring:
       error_alerts:
         enabled: true # Enable error alerts
       daily_report:
         enabled: true # Enable daily reports
   ```

3. **Restart the server** - Email monitoring will start automatically

### To Test Email Monitoring:

```bash
# Run the interactive test suite
uv run python examples/test_email_monitoring.py
```

This will:

1. Send a test email
2. Trigger an error alert by exceeding the threshold
3. Trigger a buffered alert by waiting for the buffer time
4. Send a test daily report

## Architecture Decisions

### Why Voluptuous Schema Validation?

The project already uses Voluptuous for config validation, so we:

- ✅ Maintained consistency with the existing codebase
- ✅ Leveraged existing validation patterns
- ✅ Kept dependencies minimal (no new libraries needed)
- ✅ Ensured config errors are caught at startup, not at runtime

### Why Hybrid Alert Strategy?

The hybrid approach (immediate + buffered) provides:

- ✅ **Fast response** for critical issues (5+ errors = immediate alert)
- ✅ **Spam prevention** for occasional errors (buffered alerts)
- ✅ **Cooldown period** prevents alert fatigue
- ✅ **Always sends** buffered errors (no minimum threshold for a time-based flush)

### Why Custom Logging Handler?

Using a custom `logging.Handler` provides (see the sketch after this list):

- ✅ **Zero code changes** - automatically captures all error logs
- ✅ **Clean separation** - monitoring logic stays separate from business logic
- ✅ **Standard pattern** - follows Python logging best practices
- ✅ **Easy to disable** - just remove the handler from the logger

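For reference, the pattern looks like the sketch below. This is an illustrative stand-in, not the project's `EmailAlertHandler`; the buffering, thresholds, and async email dispatch described above sit on top of this basic shape.

```python
import logging


class ErrorCollectingHandler(logging.Handler):
    """Minimal sketch: collect ERROR+ records for later alerting."""

    def __init__(self):
        super().__init__(level=logging.ERROR)
        self.records: list[logging.LogRecord] = []

    def emit(self, record: logging.LogRecord) -> None:
        # Never let monitoring crash the application.
        try:
            self.records.append(record)
        except Exception:
            self.handleError(record)


# Attaching it to the root logger captures every logger.error() call:
handler = ErrorCollectingHandler()
logging.getLogger().addHandler(handler)

logging.getLogger("demo").error("something went wrong")
assert handler.records  # the handler saw the error with zero code changes
```
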
## Files Changed/Created

### Created Files

- `src/alpine_bits_python/email_service.py` (new)
- `src/alpine_bits_python/email_monitoring.py` (new)
- `tests/test_email_service.py` (new)
- `examples/test_email_monitoring.py` (new)
- `docs/EMAIL_MONITORING.md` (new)
- `docs/EMAIL_MONITORING_QUICKSTART.md` (new)
- `docs/EMAIL_MONITORING_IMPLEMENTATION.md` (new)
- `config/.env.example` (new)

### Modified Files

- `src/alpine_bits_python/logging_config.py` - Added email handler integration
- `src/alpine_bits_python/api.py` - Added email service initialization
- `src/alpine_bits_python/config_loader.py` - **Added email config schema validation** ✅
- `config/config.yaml` - Added email configuration section

## Next Steps (Optional Enhancements)

Potential future improvements:

- [ ] Email templates with Jinja2
- [ ] Retry logic for failed email sends
- [ ] Integration with Slack, PagerDuty, Discord
- [ ] Weekly/monthly report options
- [ ] Custom alert rules based on error patterns
- [ ] Email queuing for high-volume scenarios
- [ ] Attachment support for detailed logs
- [ ] HTML email styling improvements
- [ ] Health check endpoint showing email status

## Conclusion

✅ **The email monitoring system is complete and production-ready!**

The system provides:

- Robust SMTP email sending with TLS/SSL support
- Intelligent error alerting with a hybrid threshold + time-based approach
- Scheduled daily reports with statistics and error summaries
- Comprehensive schema validation using Voluptuous
- Full test coverage with 17 passing tests
- Complete documentation and quick-start guides
- Zero impact on existing functionality

**The system is ready to use!** Just configure SMTP credentials and enable the desired features.

177
docs/EMAIL_MONITORING_QUICKSTART.md
Normal file
@@ -0,0 +1,177 @@

# Email Monitoring Quick Start

Get email notifications for errors and daily reports in 5 minutes.

## 1. Configure SMTP Settings

Edit `config/config.yaml` and add:

```yaml
email:
  smtp:
    host: "smtp.gmail.com"
    port: 587
    username: !secret EMAIL_USERNAME
    password: !secret EMAIL_PASSWORD
    use_tls: true
  from_address: "noreply@yourdomain.com"
  from_name: "AlpineBits Monitor"
```

## 2. Add the Secrets

Add the referenced secrets to `config/secrets.yaml`:

```yaml
EMAIL_USERNAME: "your_email_username"
EMAIL_PASSWORD: "your_email_password"
```

> **Note:** For Gmail, use an [App Password](https://support.google.com/accounts/answer/185833), not your regular password.

## 3. Enable Error Alerts

In `config/config.yaml`:

```yaml
email:
  monitoring:
    error_alerts:
      enabled: true
      recipients:
        - "alerts@yourdomain.com"
      error_threshold: 5
      buffer_minutes: 15
      cooldown_minutes: 15
```

**How it works:**

- Sends an immediate alert after 5 errors
- Otherwise sends after 15 minutes
- Waits 15 minutes between alerts (cooldown)

## 4. Enable Daily Reports (Optional)

In `config/config.yaml`:

```yaml
email:
  monitoring:
    daily_report:
      enabled: true
      recipients:
        - "admin@yourdomain.com"
      send_time: "08:00"
      include_stats: true
      include_errors: true
```

## 5. Test Your Configuration

Run the test script:

```bash
uv run python examples/test_email_monitoring.py
```

This will:

- ✅ Send a test email
- ✅ Trigger an error alert
- ✅ Send a test daily report

## What You Get

### Error Alert Email

When errors occur, you'll receive:

```
🚨 AlpineBits Error Alert: 5 errors (threshold exceeded)

Error Count: 5
Time Range: 14:25:00 to 14:30:00

Errors:
----------------------------------------------------------------------
[2025-10-15 14:25:12] ERROR: Database connection timeout
  Module: db:245

[2025-10-15 14:26:34] ERROR: Failed to process reservation
  Module: api:567
  Exception: ValueError: Invalid hotel code
```

### Daily Report Email

Every day at 8 AM, you'll receive:

```
📊 AlpineBits Daily Report - 2025-10-15

Statistics:
  total_reservations: 42
  new_customers: 15
  active_hotels: 4

Errors (3):
  [08:15:23] ERROR: Connection timeout
  [12:45:10] ERROR: Invalid form data
  [18:30:00] CRITICAL: Database unavailable
```

## Troubleshooting

### No emails received?

1. Check that your SMTP credentials are present in `config/secrets.yaml`:

   ```bash
   grep EMAIL_ config/secrets.yaml
   ```

2. Check application logs for errors:

   ```bash
   tail -f alpinebits.log | grep -i email
   ```

3. Test the SMTP connection manually (substitute your own credentials):

   ```bash
   uv run python -c "
   import smtplib
   with smtplib.SMTP('smtp.gmail.com', 587) as smtp:
       smtp.starttls()
       smtp.login('your_email_username', 'your_email_password')
       print('✅ SMTP connection successful')
   "
   ```

### Gmail authentication failed?

- Enable 2-factor authentication on your Google account
- Generate an App Password at https://myaccount.google.com/apppasswords
- Use the App Password (not your regular password)

### Too many emails?

- Increase `error_threshold` to only alert on serious issues
- Increase `buffer_minutes` to batch more errors together
- Increase `cooldown_minutes` to reduce alert frequency

## Next Steps

- Read the full [Email Monitoring Documentation](./EMAIL_MONITORING.md)
- Configure custom statistics for daily reports
- Set up multiple recipient groups
- Integrate with Slack or PagerDuty (coming soon)

## Support

For issues or questions:

- Check the [documentation](./EMAIL_MONITORING.md)
- Review [test examples](../examples/test_email_monitoring.py)
- Open an issue on GitHub

297
docs/MULTI_WORKER_DEPLOYMENT.md
Normal file
@@ -0,0 +1,297 @@

# Multi-Worker Deployment Guide

## Problem Statement

When running FastAPI with multiple workers (e.g., `uvicorn app:app --workers 4`), the `lifespan` function runs in **every worker process**. This causes singleton services to run multiple times:

- ❌ **Email schedulers** send duplicate notifications (4x emails if 4 workers)
- ❌ **Background tasks** run redundantly across all workers
- ❌ **Database migrations/hashing** may cause race conditions

## Solution: File-Based Worker Coordination

We use **file-based locking** to ensure only ONE worker runs singleton services. This approach:

- ✅ Works across different process managers (uvicorn, gunicorn, systemd)
- ✅ Has no external dependencies (Redis, databases)
- ✅ Provides automatic failover (if the primary worker crashes, another can acquire the lock)
- ✅ Is simple and reliable

## Implementation

### 1. Worker Coordination Module

The `worker_coordination.py` module provides:

```python
from alpine_bits_python.worker_coordination import is_primary_worker

# In your lifespan function
is_primary, worker_lock = is_primary_worker()

if is_primary:
    # Start schedulers, background tasks, etc.
    start_email_scheduler()
else:
    # This is a secondary worker - skip singleton services
    pass
```

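Under the hood this relies on an advisory `fcntl` file lock. The sketch below shows the idea; it is an assumed implementation for illustration (the names `WorkerLock` and `is_primary_worker` come from this guide, but the project's actual code may differ in detail):

```python
import fcntl
import os


class WorkerLock:
    """Minimal sketch of a non-blocking, advisory file lock."""

    def __init__(self, path: str = "/tmp/alpinebits_primary_worker.lock"):
        self.path = path
        self._fd = None

    def acquire(self) -> bool:
        self._fd = open(self.path, "w")
        try:
            # Non-blocking exclusive lock; only one process can hold it.
            fcntl.flock(self._fd, fcntl.LOCK_EX | fcntl.LOCK_NB)
        except BlockingIOError:
            self._fd.close()
            self._fd = None
            return False
        # Record our PID so operators can inspect the lock file.
        self._fd.write(str(os.getpid()))
        self._fd.flush()
        return True

    def release(self) -> None:
        if self._fd is not None:
            fcntl.flock(self._fd, fcntl.LOCK_UN)
            self._fd.close()
            self._fd = None


def is_primary_worker():
    """Return (is_primary, lock); the OS drops the lock if the holder dies."""
    lock = WorkerLock()
    return (True, lock) if lock.acquire() else (False, None)
```
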
### 2. How It Works

```
┌─────────────────────────────────────────────────────┐
│              uvicorn --workers 4                    │
└─────────────────────────────────────────────────────┘
                     │
     ┌───────────────┼──────────────┬──────────────┐
     │               │              │              │
     ▼               ▼              ▼              ▼
 Worker 0        Worker 1       Worker 2       Worker 3
(PID 1001)      (PID 1002)     (PID 1003)     (PID 1004)
     │               │              │              │
     └───────────────┴──────┬───────┴──────────────┘
                            │  All try to acquire
                            │  /tmp/alpinebits_primary_worker.lock
                            ▼
     Worker 0: ✓ Lock acquired → PRIMARY
     Worker 1: ✗ Lock busy     → SECONDARY
     Worker 2: ✗ Lock busy     → SECONDARY
     Worker 3: ✗ Lock busy     → SECONDARY
```

### 3. Lifespan Function

```python
async def lifespan(app: FastAPI):
    # Determine primary worker using file lock
    is_primary, worker_lock = is_primary_worker()

    _LOGGER.info("Worker startup: pid=%d, primary=%s", os.getpid(), is_primary)

    # All workers: shared setup
    config = load_config()
    engine = create_async_engine(DATABASE_URL)

    # Only primary worker: singleton services
    if is_primary:
        # Start email scheduler
        email_handler, report_scheduler = setup_logging(
            config, email_service, loop, enable_scheduler=True
        )
        report_scheduler.start()

        # Run database migrations/hashing
        await hash_existing_customers()
    else:
        # Secondary workers: skip schedulers
        email_handler, report_scheduler = setup_logging(
            config, email_service, loop, enable_scheduler=False
        )

    yield

    # Cleanup
    if report_scheduler:
        report_scheduler.stop()

    # Release lock
    if worker_lock:
        worker_lock.release()
```

## Deployment Scenarios

### Development (Single Worker)

```bash
# No special configuration needed
uvicorn alpine_bits_python.api:app --reload
```

Result: The single worker becomes primary automatically.

### Production (Multiple Workers)

```bash
# 4 workers for handling concurrent requests
uvicorn alpine_bits_python.api:app --workers 4 --host 0.0.0.0 --port 8000
```

Result:

- Worker 0 becomes PRIMARY → runs schedulers
- Workers 1-3 are SECONDARY → handle requests only

### With Gunicorn

```bash
gunicorn alpine_bits_python.api:app \
    --workers 4 \
    --worker-class uvicorn.workers.UvicornWorker \
    --bind 0.0.0.0:8000
```

Result: Same as uvicorn - one primary, rest secondary.

### Docker Compose

```yaml
services:
  api:
    image: alpinebits-api
    command: uvicorn alpine_bits_python.api:app --workers 4 --host 0.0.0.0
    volumes:
      - /tmp:/tmp # Important: Share lock file location
```

**Important**: When using multiple containers, ensure they share the same lock file location or use Redis-based coordination instead.

## Monitoring & Debugging
|
||||
|
||||
### Check Which Worker is Primary
|
||||
|
||||
Look for log messages at startup:
|
||||
|
||||
```
|
||||
Worker startup: pid=1001, primary=True
|
||||
Worker startup: pid=1002, primary=False
|
||||
Worker startup: pid=1003, primary=False
|
||||
Worker startup: pid=1004, primary=False
|
||||
```
|
||||
|
||||
### Check Lock File
|
||||
|
||||
```bash
|
||||
# See which PID holds the lock
|
||||
cat /tmp/alpinebits_primary_worker.lock
|
||||
# Output: 1001
|
||||
|
||||
# Verify process is running
|
||||
ps aux | grep 1001
|
||||
```
|
||||
|
||||
### Testing Worker Coordination
|
||||
|
||||
Run the test script:
|
||||
|
||||
```bash
|
||||
uv run python test_worker_coordination.py
|
||||
```
|
||||
|
||||
Expected output:
|
||||
```
|
||||
Worker 0 (PID 30773): ✓ I am PRIMARY
|
||||
Worker 1 (PID 30774): ✗ I am SECONDARY
|
||||
Worker 2 (PID 30775): ✗ I am SECONDARY
|
||||
Worker 3 (PID 30776): ✗ I am SECONDARY
|
||||
```
|
||||
|
||||
## Failover Behavior

### Primary Worker Crashes

1. The primary worker holds the lock
2. The primary worker crashes/exits → the lock is automatically released by the OS
3. Existing secondary workers remain secondary (they already failed to acquire the lock)
4. **Next restart**: The first worker to start becomes the new primary

### Graceful Restart

1. Send SIGTERM to the workers
2. The primary worker releases the lock during shutdown
3. New workers start; one becomes primary

## Lock File Location

Default: `/tmp/alpinebits_primary_worker.lock`

### Change Lock Location

```python
from alpine_bits_python.worker_coordination import WorkerLock

# Custom location
lock = WorkerLock("/var/run/alpinebits/primary.lock")
is_primary = lock.acquire()
```

**Production recommendation**: Use `/var/run/` or `/run/` for lock files (automatically cleaned on reboot).

## Common Issues

### Issue: All workers think they're primary

**Cause**: The lock file path is not accessible, or the workers run in separate containers.

**Solution**:

- Check file permissions on the lock directory
- For containers: use a shared volume or Redis-based coordination

### Issue: No worker becomes primary

**Cause**: A lock file from a previous run still exists.

**Solution**:

```bash
# Clean up the stale lock file
rm /tmp/alpinebits_primary_worker.lock
# Restart the application
```

### Issue: Duplicate emails still being sent

**Cause**: The email handler runs on all workers (not just the schedulers).

**Solution**: This is correct behavior. The email **alert handler** runs on all workers (to catch errors from any worker), while the email **scheduler** only runs on the primary. Alerts can come from any worker; scheduled reports come only from the primary.

## Alternative Approaches

### Redis-Based Coordination

For multi-container deployments, consider Redis-based locks:

```python
import redis
from redis.lock import Lock

redis_client = redis.Redis(host='redis', port=6379)
lock = Lock(redis_client, "alpinebits_primary_worker", timeout=60)

if lock.acquire(blocking=False):
    # This is the primary worker
    start_schedulers()
```

**Pros**: Works across containers
**Cons**: Requires a Redis dependency

### Environment Variable (Not Recommended)

```bash
# Manually set the primary worker
ALPINEBITS_PRIMARY_WORKER=true uvicorn app:app
```

**Pros**: Simple
**Cons**: Manual configuration, no automatic failover

## Best Practices

1. ✅ **Use file locks for single-host deployments** (our implementation)
2. ✅ **Use Redis locks for multi-container deployments**
3. ✅ **Log primary/secondary status at startup**
4. ✅ **Always release locks on shutdown**
5. ✅ **Keep lock files in `/var/run/` or `/tmp/`**
6. ❌ **Don't rely on process names** (unreliable with uvicorn)
7. ❌ **Don't use environment variables** (no automatic failover)
8. ❌ **Don't skip coordination** (it will cause duplicate notifications)

## Summary

With file-based worker coordination:

- ✅ Only ONE worker runs singleton services (schedulers, migrations)
- ✅ All workers handle HTTP requests normally
- ✅ Automatic failover if the primary worker crashes
- ✅ No external dependencies needed
- ✅ Works with uvicorn, gunicorn, and other ASGI servers

This ensures you get the benefits of multiple workers (concurrency) without duplicate email notifications or race conditions.

154
docs/architecture_diagram.txt
Normal file
@@ -0,0 +1,154 @@

╔══════════════════════════════════════════════════════════════════════════════╗
║                      MULTI-WORKER FASTAPI ARCHITECTURE                         ║
╚══════════════════════════════════════════════════════════════════════════════╝

┌─────────────────────────────────────────────────────────────────────────────┐
│ Command: uvicorn alpine_bits_python.api:app --workers 4                     │
└─────────────────────────────────────────────────────────────────────────────┘
                          │
                          ▼
        ┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┓
        ┃    Master Process (uvicorn supervisor)     ┃
        ┗━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┛
              │          │          │          │
  ┌───────────┼──────────┼──────────┼──────────┼───────────┐
  │           │          │          │          │           │
  ▼           ▼          ▼          ▼          ▼           ▼
┌────────┐ ┌────────┐ ┌────────┐ ┌────────┐           ┌──────────────────┐
│Worker 0│ │Worker 1│ │Worker 2│ │Worker 3│           │Lock File         │
│PID:1001│ │PID:1002│ │PID:1003│ │PID:1004│           │/tmp/alpinebits   │
└────┬───┘ └───┬────┘ └───┬────┘ └───┬────┘           │_primary_worker   │
     │         │          │          │                │.lock             │
     │         │          │          │                └──────────────────┘
     │         │          │          │                         ▲
     │         │          │          │                         │
     └─────────┴──────────┴──────────┴─────────────────────────┤
               All try to acquire lock                         │
                          │                                    │
                          ▼                                    │
              ┌───────────────────────┐                        │
              │ fcntl.flock(LOCK_EX)  │────────────────────────┘
              │ Non-blocking attempt  │
              └───────────────────────┘
                          │
         ┏━━━━━━━━━━━━━━━━┻━━━━━━━━━━━━━━━━┓
         ▼                                 ▼
    ┌─────────┐                    ┌──────────────┐
    │SUCCESS  │                    │ WOULD BLOCK  │
    │(First)  │                    │(Others)      │
    └────┬────┘                    └──────┬───────┘
         │                                │
         ▼                                ▼

╔════════════════════════════════╗  ╔══════════════════════════════╗
║        PRIMARY WORKER          ║  ║      SECONDARY WORKERS       ║
║     (Worker 0, PID 1001)       ║  ║        (Workers 1-3)         ║
╠════════════════════════════════╣  ╠══════════════════════════════╣
║                                ║  ║                              ║
║ ✓ Handle HTTP requests         ║  ║ ✓ Handle HTTP requests       ║
║ ✓ Start email scheduler        ║  ║ ✗ Skip email scheduler       ║
║ ✓ Send daily reports           ║  ║ ✗ Skip daily reports         ║
║ ✓ Run DB migrations            ║  ║ ✗ Skip DB migrations         ║
║ ✓ Hash customers (startup)     ║  ║ ✗ Skip customer hashing      ║
║ ✓ Send error alerts            ║  ║ ✓ Send error alerts          ║
║ ✓ Process webhooks             ║  ║ ✓ Process webhooks           ║
║ ✓ AlpineBits endpoints         ║  ║ ✓ AlpineBits endpoints       ║
║                                ║  ║                              ║
║ Holds: worker_lock             ║  ║ worker_lock = None           ║
║                                ║  ║                              ║
╚════════════════════════════════╝  ╚══════════════════════════════╝
                │                                 │
                │                                 │
                └──────────┬──────────────────────┘
                           │
                           ▼
              ┌───────────────────────────┐
              │   Incoming HTTP Request   │
              └───────────────────────────┘
                           │
                 (Load balanced by OS)
                           │
               ┌───────────┴──────────────┐
               │                          │
               ▼                          ▼
     Any worker can handle      Round-robin distribution
     the request normally       across all 4 workers


╔══════════════════════════════════════════════════════════════════════════════╗
║                              SINGLETON SERVICES                                ║
╚══════════════════════════════════════════════════════════════════════════════╝

Only run on PRIMARY worker:

┌─────────────────────────────────────────────────────────────┐
│ Email Scheduler                                             │
│  ├─ Daily Report: 8:00 AM                                   │
│  └─ Stats Collection: Per-hotel reservation counts          │
└─────────────────────────────────────────────────────────────┘

┌─────────────────────────────────────────────────────────────┐
│ Startup Tasks (One-time)                                    │
│  ├─ Database table creation                                 │
│  ├─ Customer data hashing/backfill                          │
│  └─ Configuration validation                                │
└─────────────────────────────────────────────────────────────┘


╔══════════════════════════════════════════════════════════════════════════════╗
║                               SHARED SERVICES                                  ║
╚══════════════════════════════════════════════════════════════════════════════╝

Run on ALL workers (primary + secondary):

┌─────────────────────────────────────────────────────────────┐
│ HTTP Request Handling                                       │
│  ├─ Webhook endpoints (/api/webhook/*)                      │
│  ├─ AlpineBits endpoints (/api/alpinebits/*)                │
│  └─ Health checks (/api/health)                             │
└─────────────────────────────────────────────────────────────┘

┌─────────────────────────────────────────────────────────────┐
│ Error Alert Handler                                         │
│  └─ Any worker can send immediate error alerts              │
└─────────────────────────────────────────────────────────────┘

┌─────────────────────────────────────────────────────────────┐
│ Event Dispatching                                           │
│  └─ Background tasks triggered by webhooks                  │
└─────────────────────────────────────────────────────────────┘


╔══════════════════════════════════════════════════════════════════════════════╗
║                              SHUTDOWN & FAILOVER                               ║
╚══════════════════════════════════════════════════════════════════════════════╝

Graceful Shutdown:
┌─────────────────────────────────────────────────────────────┐
│ 1. SIGTERM received                                         │
│ 2. Stop scheduler (primary only)                            │
│ 3. Close email handler                                      │
│ 4. Release worker_lock (primary only)                       │
│ 5. Dispose database engine                                  │
└─────────────────────────────────────────────────────────────┘

Primary Worker Crash:
┌─────────────────────────────────────────────────────────────┐
│ 1. Primary worker crashes                                   │
│ 2. OS automatically releases file lock                      │
│ 3. Secondary workers continue handling requests             │
│ 4. On next restart, first worker becomes new primary        │
└─────────────────────────────────────────────────────────────┘


╔══════════════════════════════════════════════════════════════════════════════╗
║                                 KEY BENEFITS                                   ║
╚══════════════════════════════════════════════════════════════════════════════╝

✓ No duplicate email notifications
✓ No race conditions in database operations
✓ Automatic failover if primary crashes
✓ Load distribution for HTTP requests
✓ No external dependencies (Redis, etc.)
✓ Simple and reliable

1
examples/Reservierungen_bemelman_20251117_064824.xml
Normal file
File diff suppressed because one or more lines are too long
5
examples/Reservierungen_sebastian_20251021_115750.xml
Normal file
@@ -0,0 +1,5 @@

<?xml version="1.0" encoding="UTF-8"?>
|
||||
<reservations>
|
||||
<Deletedreservation ID="2473" />
|
||||
<Deletedreservation ID="2475" />
|
||||
</reservations>
|
||||
42
examples/Reservierungen_sebastian_20251022_055346.xml
Normal file
@@ -0,0 +1,42 @@

<?xml version="1.0" encoding="UTF-8"?>
|
||||
<reservations>
|
||||
<reservation hotelID="135" id="2498" number="240" date="2025-10-21"
|
||||
creationTime="2025-10-21T14:03:24" type="reservation" bookingChannel="WHO_KNOWS_WHO_KNOWS"
|
||||
advertisingMedium="99TALES" advertisingPartner="cpc" advertisingCampagne="IwAR123fbclid456">
|
||||
<guest id="380" lastName="Schmidt" firstName="Maria" language="de" gender="female"
|
||||
email="maria.schmidt@gmail.com" />
|
||||
<roomReservations>
|
||||
<roomReservation arrival="2025-11-15" departure="2025-11-18" status="reserved"
|
||||
roomType="EZ" roomNumber="106" adults="1" ratePlanCode="STD" connectedRoomType="0">
|
||||
<dailySales>
|
||||
<dailySale date="2025-11-15" revenueTotal="165" revenueLogis="140.2"
|
||||
revenueBoard="9" revenueFB="10" revenueSpa="1" revenueOther="4.8" />
|
||||
<dailySale date="2025-11-16" revenueTotal="165" revenueLogis="140.2"
|
||||
revenueBoard="9" revenueFB="10" revenueSpa="1" revenueOther="4.8" />
|
||||
<dailySale date="2025-11-17" revenueTotal="165" revenueLogis="140.2"
|
||||
revenueBoard="9" revenueFB="10" revenueSpa="1" revenueOther="4.8" />
|
||||
<dailySale date="2025-11-18" />
|
||||
</dailySales>
|
||||
</roomReservation>
|
||||
</roomReservations>
|
||||
</reservation>
|
||||
<reservation hotelID="135" id="2499" number="241" date="2025-10-21"
|
||||
creationTime="2025-10-21T14:04:26" type="reservation" bookingChannel="WHO_KNOWS_WHO_KNOWS"
|
||||
advertisingMedium="99TALES" advertisingPartner="website"
|
||||
advertisingCampagne="nduaitreuditaor">
|
||||
<guest id="381" lastName="Linter" firstName="Jonas" language="de" gender="male"
|
||||
email="jonas@vaius.ai" />
|
||||
<roomReservations>
|
||||
<roomReservation arrival="2025-10-28" departure="2025-10-30" status="reserved"
|
||||
roomType="DZ" roomNumber="101" adults="2" connectedRoomType="0">
|
||||
<dailySales>
|
||||
<dailySale date="2025-10-28" revenueTotal="474" revenueLogis="372.16"
|
||||
revenueBoard="67.96" revenueFB="20" revenueSpa="2" revenueOther="11.88" />
|
||||
<dailySale date="2025-10-29" revenueTotal="474" revenueLogis="372.16"
|
||||
revenueBoard="67.96" revenueFB="20" revenueSpa="2" revenueOther="11.88" />
|
||||
<dailySale date="2025-10-30" />
|
||||
</dailySales>
|
||||
</roomReservation>
|
||||
</roomReservations>
|
||||
</reservation>
|
||||
</reservations>
|
||||
964315
examples/formatted_reservierungen.xml
Normal file
File diff suppressed because it is too large
305
examples/test_email_monitoring.py
Normal file
@@ -0,0 +1,305 @@

"""Example script to test email monitoring functionality.
|
||||
|
||||
This script demonstrates how to:
|
||||
1. Configure the email service
|
||||
2. Send test emails
|
||||
3. Trigger error alerts
|
||||
4. Test daily report generation
|
||||
|
||||
Usage:
|
||||
uv run python examples/test_email_monitoring.py
|
||||
"""
|
||||
|
||||
import asyncio
|
||||
import logging
|
||||
from datetime import datetime
|
||||
|
||||
from alpine_bits_python.config_loader import load_config
|
||||
from alpine_bits_python.email_monitoring import (
|
||||
DailyReportScheduler,
|
||||
EmailAlertHandler,
|
||||
)
|
||||
from alpine_bits_python.email_service import create_email_service
|
||||
from alpine_bits_python.logging_config import get_logger, setup_logging
|
||||
|
||||
_LOGGER = get_logger(__name__)
|
||||
|
||||
|
||||
async def test_basic_email():
|
||||
"""Test 1: Send a basic test email."""
|
||||
print("\n" + "=" * 60)
|
||||
print("Test 1: Basic Email Sending")
|
||||
print("=" * 60)
|
||||
|
||||
config = load_config()
|
||||
email_service = create_email_service(config)
|
||||
|
||||
if not email_service:
|
||||
print("❌ Email service not configured. Check your config.yaml")
|
||||
return False
|
||||
|
||||
print("✓ Email service initialized")
|
||||
|
||||
# Get the first recipient from error_alerts config
|
||||
email_config = config.get("email", {})
|
||||
monitoring_config = email_config.get("monitoring", {})
|
||||
error_alerts_config = monitoring_config.get("error_alerts", {})
|
||||
recipients = error_alerts_config.get("recipients", [])
|
||||
|
||||
if not recipients:
|
||||
print("❌ No recipients configured in error_alerts")
|
||||
return False
|
||||
|
||||
print(f"✓ Sending test email to: {recipients[0]}")
|
||||
|
||||
success = await email_service.send_email(
|
||||
recipients=[recipients[0]],
|
||||
subject="AlpineBits Email Test - Basic",
|
||||
body=f"""This is a test email from the AlpineBits server.
|
||||
|
||||
Timestamp: {datetime.now().isoformat()}
|
||||
Test: Basic email sending
|
||||
|
||||
If you received this email, your SMTP configuration is working correctly!
|
||||
|
||||
---
|
||||
AlpineBits Python Server
|
||||
Email Monitoring System
|
||||
""",
|
||||
)
|
||||
|
||||
if success:
|
||||
print("✅ Test email sent successfully!")
|
||||
return True
|
||||
else:
|
||||
print("❌ Failed to send test email. Check logs for details.")
|
||||
return False
|
||||
|
||||
|
||||
async def test_error_alert_threshold():
|
||||
"""Test 2: Trigger immediate error alert by exceeding threshold."""
|
||||
print("\n" + "=" * 60)
|
||||
print("Test 2: Error Alert - Threshold Trigger")
|
||||
print("=" * 60)
|
||||
|
||||
config = load_config()
|
||||
email_service = create_email_service(config)
|
||||
|
||||
if not email_service:
|
||||
print("❌ Email service not configured")
|
||||
return False
|
||||
|
||||
# Setup logging with email monitoring
|
||||
loop = asyncio.get_running_loop()
|
||||
email_handler, _ = setup_logging(config, email_service, loop)
|
||||
|
||||
if not email_handler:
|
||||
print("❌ Error alert handler not configured")
|
||||
return False
|
||||
|
||||
print(f"✓ Error alert handler configured (threshold: {email_handler.error_threshold})")
|
||||
print(f" Recipients: {email_handler.recipients}")
|
||||
|
||||
# Generate errors to exceed threshold
|
||||
threshold = email_handler.error_threshold
|
||||
print(f"\n📨 Generating {threshold} errors to trigger immediate alert...")
|
||||
|
||||
logger = logging.getLogger("test.error.threshold")
|
||||
for i in range(threshold):
|
||||
logger.error(f"Test error #{i + 1} - Threshold test at {datetime.now().isoformat()}")
|
||||
print(f" → Error {i + 1}/{threshold} logged")
|
||||
await asyncio.sleep(0.1) # Small delay between errors
|
||||
|
||||
# Wait a bit for email to be sent
|
||||
print("\n⏳ Waiting for alert email to be sent...")
|
||||
await asyncio.sleep(3)
|
||||
|
||||
print("✅ Threshold test complete! Check your email for the alert.")
|
||||
return True
|
||||
|
||||
|
||||
async def test_error_alert_buffer():
|
||||
"""Test 3: Trigger buffered error alert by waiting for buffer time."""
|
||||
print("\n" + "=" * 60)
|
||||
print("Test 3: Error Alert - Buffer Time Trigger")
|
||||
print("=" * 60)
|
||||
|
||||
config = load_config()
|
||||
email_service = create_email_service(config)
|
||||
|
||||
if not email_service:
|
||||
print("❌ Email service not configured")
|
||||
return False
|
||||
|
||||
# Setup logging with email monitoring
|
||||
loop = asyncio.get_running_loop()
|
||||
email_handler, _ = setup_logging(config, email_service, loop)
|
||||
|
||||
if not email_handler:
|
||||
print("❌ Error alert handler not configured")
|
||||
return False
|
||||
|
||||
print(f"✓ Error alert handler configured (buffer: {email_handler.buffer_minutes} minutes)")
|
||||
|
||||
# Generate fewer errors than threshold
|
||||
num_errors = max(1, email_handler.error_threshold - 2)
|
||||
print(f"\n📨 Generating {num_errors} errors (below threshold)...")
|
||||
|
||||
logger = logging.getLogger("test.error.buffer")
|
||||
for i in range(num_errors):
|
||||
logger.error(f"Test error #{i + 1} - Buffer test at {datetime.now().isoformat()}")
|
||||
print(f" → Error {i + 1}/{num_errors} logged")
|
||||
|
||||
buffer_seconds = email_handler.buffer_minutes * 60
|
||||
print(f"\n⏳ Waiting {email_handler.buffer_minutes} minute(s) for buffer to flush...")
|
||||
print(" (This will send an email with all buffered errors)")
|
||||
|
||||
# Wait for buffer time + a bit extra
|
||||
await asyncio.sleep(buffer_seconds + 2)
|
||||
|
||||
print("✅ Buffer test complete! Check your email for the alert.")
|
||||
return True
|
||||
|
||||
|
||||
async def test_daily_report():
|
||||
"""Test 4: Generate and send a test daily report."""
|
||||
print("\n" + "=" * 60)
|
||||
print("Test 4: Daily Report")
|
||||
print("=" * 60)
|
||||
|
||||
config = load_config()
|
||||
email_service = create_email_service(config)
|
||||
|
||||
if not email_service:
|
||||
print("❌ Email service not configured")
|
||||
return False
|
||||
|
||||
# Create a daily report scheduler
|
||||
daily_report_config = (
|
||||
config.get("email", {})
|
||||
.get("monitoring", {})
|
||||
.get("daily_report", {})
|
||||
)
|
||||
|
||||
if not daily_report_config.get("enabled"):
|
||||
print("⚠️ Daily reports not enabled in config")
|
||||
print(" Set email.monitoring.daily_report.enabled = true")
|
||||
return False
|
||||
|
||||
scheduler = DailyReportScheduler(email_service, daily_report_config)
|
||||
print(f"✓ Daily report scheduler configured")
|
||||
print(f" Recipients: {scheduler.recipients}")
|
||||
print(f" Send time: {scheduler.send_time}")
|
||||
|
||||
# Add some test statistics
|
||||
test_stats = {
|
||||
"total_reservations": 42,
|
||||
"new_customers": 15,
|
||||
"active_hotels": 4,
|
||||
"api_requests_today": 1234,
|
||||
"average_response_time_ms": 45,
|
||||
"success_rate": "99.2%",
|
||||
}
|
||||
|
||||
# Add some test errors
|
||||
test_errors = [
|
||||
{
|
||||
"timestamp": "2025-10-15 08:15:23",
|
||||
"level": "ERROR",
|
||||
"message": "Connection timeout to external API",
|
||||
},
|
||||
{
|
||||
"timestamp": "2025-10-15 12:45:10",
|
||||
"level": "ERROR",
|
||||
"message": "Invalid form data submitted",
|
||||
},
|
||||
{
|
||||
"timestamp": "2025-10-15 18:30:00",
|
||||
"level": "CRITICAL",
|
||||
"message": "Database connection pool exhausted",
|
||||
},
|
||||
]
|
||||
|
||||
print("\n📊 Sending test daily report...")
|
||||
print(f" Stats: {len(test_stats)} metrics")
|
||||
print(f" Errors: {len(test_errors)} entries")
|
||||
|
||||
success = await email_service.send_daily_report(
|
||||
recipients=scheduler.recipients,
|
||||
stats=test_stats,
|
||||
errors=test_errors,
|
||||
)
|
||||
|
||||
if success:
|
||||
print("✅ Daily report sent successfully!")
|
||||
return True
|
||||
else:
|
||||
print("❌ Failed to send daily report. Check logs for details.")
|
||||
return False
|
||||
|
||||
|
||||
async def run_all_tests():
|
||||
"""Run all email monitoring tests."""
|
||||
print("\n" + "=" * 60)
|
||||
print("AlpineBits Email Monitoring Test Suite")
|
||||
print("=" * 60)
|
||||
|
||||
tests = [
|
||||
("Basic Email", test_basic_email),
|
||||
("Error Alert (Threshold)", test_error_alert_threshold),
|
||||
("Error Alert (Buffer)", test_error_alert_buffer),
|
||||
("Daily Report", test_daily_report),
|
||||
]
|
||||
|
||||
results = []
|
||||
|
||||
for test_name, test_func in tests:
|
||||
try:
|
||||
result = await test_func()
|
||||
results.append((test_name, result))
|
||||
except Exception as e:
|
||||
print(f"\n❌ Test '{test_name}' failed with exception: {e}")
|
||||
results.append((test_name, False))
|
||||
|
||||
# Wait between tests to avoid rate limiting
|
||||
await asyncio.sleep(2)
|
||||
|
||||
# Print summary
|
||||
print("\n" + "=" * 60)
|
||||
print("Test Summary")
|
||||
print("=" * 60)
|
||||
|
||||
passed = sum(1 for _, result in results if result)
|
||||
total = len(results)
|
||||
|
||||
for test_name, result in results:
|
||||
status = "✅ PASS" if result else "❌ FAIL"
|
||||
print(f"{status}: {test_name}")
|
||||
|
||||
print(f"\nTotal: {passed}/{total} tests passed")
|
||||
|
||||
if passed == total:
|
||||
print("\n🎉 All tests passed!")
|
||||
else:
|
||||
print(f"\n⚠️ {total - passed} test(s) failed")
|
||||
|
||||
|
||||
def main():
|
||||
"""Main entry point."""
|
||||
print("Starting email monitoring tests...")
|
||||
print("Make sure you have configured email settings in config.yaml")
|
||||
print("and set EMAIL_USERNAME and EMAIL_PASSWORD environment variables.")
|
||||
|
||||
# Run the tests
|
||||
try:
|
||||
asyncio.run(run_all_tests())
|
||||
except KeyboardInterrupt:
|
||||
print("\n\n⚠️ Tests interrupted by user")
|
||||
except Exception as e:
|
||||
print(f"\n\n❌ Fatal error: {e}")
|
||||
import traceback
|
||||
traceback.print_exc()
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
main()
|
||||
46
format_xml.py
Normal file
@@ -0,0 +1,46 @@

#!/usr/bin/env python3
"""Format a large XML file for readability."""

import sys
import xml.dom.minidom
from pathlib import Path


def format_xml(input_path, output_path=None):
    """Format an XML file with proper indentation."""
    input_file = Path(input_path)

    if not input_file.exists():
        print(f"Error: File {input_path} not found", file=sys.stderr)
        sys.exit(1)

    print(f"Reading {input_file.name}...", file=sys.stderr)
    with open(input_file, 'r', encoding='utf-8') as f:
        xml_content = f.read()

    print("Parsing XML...", file=sys.stderr)
    dom = xml.dom.minidom.parseString(xml_content)

    print("Formatting XML...", file=sys.stderr)
    pretty_xml = dom.toprettyxml(indent=" ")

    # Remove the extra blank lines that toprettyxml adds
    pretty_xml = "\n".join([line for line in pretty_xml.split("\n") if line.strip()])

    if output_path is None:
        output_path = input_file.with_stem(input_file.stem + "_formatted")

    print(f"Writing formatted XML to {output_path}...", file=sys.stderr)
    with open(output_path, 'w', encoding='utf-8') as f:
        f.write(pretty_xml)

    print(f"Done! Formatted XML saved to {output_path}", file=sys.stderr)


if __name__ == "__main__":
    if len(sys.argv) < 2:
        print("Usage: python format_xml.py <input_file> [output_file]", file=sys.stderr)
        sys.exit(1)

    input_file = sys.argv[1]
    output_file = sys.argv[2] if len(sys.argv) > 2 else None

    format_xml(input_file, output_file)

@@ -11,11 +11,14 @@ requires-python = ">=3.13"
dependencies = [
    "aiosqlite>=0.21.0",
    "annotatedyaml>=1.0.0",
    "asyncpg>=0.30.0",
    "dotenv>=0.9.9",
    "fast-langdetect>=1.0.0",
    "fastapi>=0.117.1",
    "generateds>=2.44.3",
    "httpx>=0.28.1",
    "lxml>=6.0.1",
    "pushover-complete>=2.0.0",
    "pydantic[email]>=2.11.9",
    "pytest>=8.4.2",
    "pytest-asyncio>=1.2.0",

@@ -1,9 +1,12 @@
import re
import traceback
from dataclasses import dataclass
from datetime import UTC
from enum import Enum
from typing import Any

from email_validator import EmailNotValidError, validate_email

from alpine_bits_python.db import Customer, Reservation
from alpine_bits_python.logging_config import get_logger
from alpine_bits_python.schemas import (
@@ -22,6 +25,7 @@ from .generated.alpinebits import (
    OtaHotelResNotifRq,
    OtaResRetrieveRs,
    ProfileProfileType,
    RoomTypeRoomType,
    UniqueIdType2,
)

@@ -73,6 +77,13 @@ RetrieveRoomStays = OtaResRetrieveRs.ReservationsList.HotelReservation.RoomStays
NotifHotelReservation = OtaHotelResNotifRq.HotelReservations.HotelReservation
RetrieveHotelReservation = OtaResRetrieveRs.ReservationsList.HotelReservation

NotifRoomTypes = (
    OtaHotelResNotifRq.HotelReservations.HotelReservation.RoomStays.RoomStay.RoomTypes
)
RetrieveRoomTypes = (
    OtaResRetrieveRs.ReservationsList.HotelReservation.RoomStays.RoomStay.RoomTypes
)

from .const import RESERVATION_ID_TYPE


@@ -618,6 +629,20 @@ def create_res_notif_push_message(
    return _create_xml_from_db(list, OtaMessageType.NOTIF, config)


def _validate_and_repair_email(email: str | None) -> str | None:
    if email is None:
        return None
    try:
        # remove numbers from the top-level domain (TLD) if any
        email = re.sub(r"\.\d+", ".", email)

        email_info = validate_email(email)
    except EmailNotValidError as e:
        _LOGGER.warning("invalid email address: %s", e)
        return None
    return email_info.normalized


def _process_single_reservation(
    reservation: Reservation,
    customer: Customer,
@@ -628,13 +653,16 @@
        [(customer.phone, PhoneTechType.MOBILE)] if customer.phone is not None else []
    )

    # Validate and repair the email address
    email = _validate_and_repair_email(customer.email_address)

    customer_data = CustomerData(
        given_name=customer.given_name,
        surname=customer.surname,
        name_prefix=customer.name_prefix,
        name_title=customer.name_title,
        phone_numbers=phone_numbers,
        email_address=customer.email_address,
        email_address=email,
        email_newsletter=customer.email_newsletter,
        address_line=customer.address_line,
        city_name=customer.city_name,
@@ -677,9 +705,29 @@
        start=reservation.start_date.isoformat() if reservation.start_date else None,
        end=reservation.end_date.isoformat() if reservation.end_date else None,
    )

    # RoomTypes (optional) - only create if at least one field is present
    room_types = None
    if any([reservation.room_type_code, reservation.room_classification_code, reservation.room_type]):
        # Convert the room_type string to an enum if present
        room_type_enum = None
        if reservation.room_type:
            room_type_enum = RoomTypeRoomType(reservation.room_type)

        # Create the RoomType instance
        room_type_obj = RoomStays.RoomStay.RoomTypes.RoomType(
            room_type_code=reservation.room_type_code,
            room_classification_code=reservation.room_classification_code,
            room_type=room_type_enum,
        )

        # Create the RoomTypes container
        room_types = RoomStays.RoomStay.RoomTypes(room_type=room_type_obj)

    room_stay = RoomStays.RoomStay(
        time_span=time_span,
        guest_counts=guest_counts,
        room_types=room_types,
    )
    room_stays = RoomStays(
        room_stay=[room_stay],

@@ -11,11 +11,9 @@ import re
from abc import ABC
from dataclasses import dataclass
from datetime import datetime
from enum import Enum, IntEnum
from enum import Enum
from typing import Any, Optional, override
from zoneinfo import ZoneInfo

from sqlalchemy import select
from xsdata.formats.dataclass.serializers.config import SerializerConfig
from xsdata_pydantic.bindings import XmlParser, XmlSerializer

@@ -25,7 +23,8 @@ from alpine_bits_python.alpine_bits_helpers import (
)
from alpine_bits_python.logging_config import get_logger

from .db import AckedRequest, Customer, Reservation
from .const import HttpStatusCode
from .db import Customer, Reservation
from .generated.alpinebits import (
    OtaNotifReportRq,
    OtaNotifReportRs,
@@ -34,20 +33,12 @@ from .generated.alpinebits import (
    OtaReadRq,
    WarningStatus,
)
from .reservation_service import ReservationService

# Configure logging
_LOGGER = get_logger(__name__)


class HttpStatusCode(IntEnum):
    """Allowed HTTP status codes for AlpineBits responses."""

    OK = 200
    BAD_REQUEST = 400
    UNAUTHORIZED = 401
    INTERNAL_SERVER_ERROR = 500


def dump_json_for_xml(json_content: Any) -> str:
    """Dump JSON content as a pretty-printed string for embedding in XML.

@@ -147,7 +138,8 @@ class AlpineBitsResponse:
        """Validate that status code is one of the allowed values."""
        if self.status_code not in [200, 400, 401, 500]:
            raise ValueError(
                f"Invalid status code {self.status_code}. Must be 200, 400, 401, or 500"
                "Invalid status code %s. Must be 200, 400, 401, or 500",
                self.status_code,
            )


@@ -227,9 +219,9 @@ class ServerCapabilities:

    def _is_action_implemented(self, action_class: type[AlpineBitsAction]) -> bool:
        """Check if an action is actually implemented or just uses the default behavior.

        This is a simple check - in practice, you might want more sophisticated detection.
        """
        # Check if the class has overridden the handle method
        return "handle" in action_class.__dict__

    def create_capabilities_dict(self) -> None:
@@ -510,32 +502,31 @@ class ReadAction(AlpineBitsAction):
                hotel_read_request.selection_criteria.start
            )

            # query all reservations for this hotel from the database, where start_date is greater than or equal to the given start_date
            # Use ReservationService to query reservations
            reservation_service = ReservationService(dbsession)

            stmt = (
                select(Reservation, Customer)
                .join(Customer, Reservation.customer_id == Customer.id)
                .filter(Reservation.hotel_code == hotelid)
            )
            if start_date:
                _LOGGER.info("Filtering reservations from start date %s", start_date)
                stmt = stmt.filter(Reservation.created_at >= start_date)
            # remove reservations that have been acknowledged via client_id
            elif client_info.client_id:
                subquery = (
                    select(Reservation.id)
                    .join(
                        AckedRequest,
                        Reservation.md5_unique_id == AckedRequest.unique_id,
                reservation_customer_pairs = (
                    await reservation_service.get_reservations_with_filters(
                        start_date=start_date, hotel_code=hotelid
                    )
                )
            elif client_info.username or client_info.client_id:
                # Remove reservations that have been acknowledged via username (preferred) or client_id
                reservation_customer_pairs = (
                    await reservation_service.get_unacknowledged_reservations(
                        username=client_info.username,
                        client_id=client_info.client_id,
                        hotel_code=hotelid
                    )
                )
            else:
                reservation_customer_pairs = (
                    await reservation_service.get_reservations_with_filters(
                        hotel_code=hotelid
                    )
                    .filter(AckedRequest.client_id == client_info.client_id)
                )
                stmt = stmt.filter(~Reservation.id.in_(subquery))

            result = await dbsession.execute(stmt)
            reservation_customer_pairs: list[tuple[Reservation, Customer]] = (
                result.all()
            )  # List of (Reservation, Customer) tuples

            _LOGGER.info(
                "Querying reservations and customers for hotel %s from database",
@@ -616,19 +607,16 @@ class NotifReportReadAction(AlpineBitsAction):
                "Error: Something went wrong", HttpStatusCode.INTERNAL_SERVER_ERROR
            )

            timestamp = datetime.now(ZoneInfo("UTC"))
            # Use ReservationService to record acknowledgements
            reservation_service = ReservationService(dbsession)

            for entry in (
                notif_report_details.hotel_notif_report.hotel_reservations.hotel_reservation
            ):  # type: ignore
                unique_id = entry.unique_id.id
                acked_request = AckedRequest(
                    unique_id=unique_id,
                    client_id=client_info.client_id,
                    timestamp=timestamp,
                md5_unique_id = entry.unique_id.id
                await reservation_service.record_acknowledgement(
                    client_id=client_info.client_id, unique_id=md5_unique_id, username=client_info.username
                )
                dbsession.add(acked_request)

            await dbsession.commit()

            return AlpineBitsResponse(response_xml, HttpStatusCode.OK)

File diff suppressed because it is too large
@@ -6,9 +6,12 @@ from annotatedyaml.loader import load_yaml as load_annotated_yaml
from voluptuous import (
    PREVENT_EXTRA,
    All,
    Boolean,
    In,
    Length,
    MultipleInvalid,
    Optional,
    Range,
    Required,
    Schema,
)
@@ -16,11 +19,13 @@
from alpine_bits_python.const import (
    CONF_ALPINE_BITS_AUTH,
    CONF_DATABASE,
    CONF_GOOGLE_ACCOUNT,
    CONF_HOTEL_ID,
    CONF_HOTEL_NAME,
    CONF_LOGGING,
    CONF_LOGGING_FILE,
    CONF_LOGGING_LEVEL,
    CONF_META_ACCOUNT,
    CONF_PASSWORD,
    CONF_PUSH_ENDPOINT,
    CONF_PUSH_TOKEN,
@@ -36,7 +41,9 @@ from alpine_bits_python.const import (
)

# --- Voluptuous schemas ---
database_schema = Schema({Required("url"): str}, extra=PREVENT_EXTRA)
database_schema = Schema(
    {Required("url"): str, Optional("schema"): str}, extra=PREVENT_EXTRA
)


logger_schema = Schema(
@@ -71,6 +78,8 @@ hotel_auth_schema = Schema(
        Required(CONF_HOTEL_NAME): str,
        Required(CONF_USERNAME): str,
        Required(CONF_PASSWORD): str,
        Optional(CONF_META_ACCOUNT): str,
        Optional(CONF_GOOGLE_ACCOUNT): str,
        Optional(CONF_PUSH_ENDPOINT): {
            Required(CONF_PUSH_URL): str,
            Required(CONF_PUSH_TOKEN): str,
@@ -82,12 +91,183 @@ hotel_auth_schema = Schema(

basic_auth_schema = Schema(All([hotel_auth_schema], Length(min=1)))

# Email SMTP configuration schema
|
||||
smtp_schema = Schema(
|
||||
{
|
||||
Required("host", default="localhost"): str,
|
||||
Required("port", default=587): Range(min=1, max=65535),
|
||||
Optional("username"): str,
|
||||
Optional("password"): str,
|
||||
Required("use_tls", default=True): Boolean(),
|
||||
Required("use_ssl", default=False): Boolean(),
|
||||
},
|
||||
extra=PREVENT_EXTRA,
|
||||
)
|
||||
|
||||
# Email daily report configuration schema
|
||||
daily_report_schema = Schema(
|
||||
{
|
||||
Required("enabled", default=False): Boolean(),
|
||||
Optional("recipients", default=[]): [str],
|
||||
Required("send_time", default="08:00"): str,
|
||||
Required("include_stats", default=True): Boolean(),
|
||||
Required("include_errors", default=True): Boolean(),
|
||||
},
|
||||
extra=PREVENT_EXTRA,
|
||||
)
|
||||
|
||||
# Email error alerts configuration schema
|
||||
error_alerts_schema = Schema(
|
||||
{
|
||||
Required("enabled", default=False): Boolean(),
|
||||
Optional("recipients", default=[]): [str],
|
||||
Required("error_threshold", default=5): Range(min=1),
|
||||
Required("buffer_minutes", default=15): Range(min=1),
|
||||
Required("cooldown_minutes", default=15): Range(min=0),
|
||||
Required("log_levels", default=["ERROR", "CRITICAL"]): [
|
||||
In(["DEBUG", "INFO", "WARNING", "ERROR", "CRITICAL"])
|
||||
],
|
||||
},
|
||||
extra=PREVENT_EXTRA,
|
||||
)
|
||||
|
||||
# Email monitoring configuration schema
|
||||
monitoring_schema = Schema(
    {
        Optional("daily_report", default={}): daily_report_schema,
        Optional("error_alerts", default={}): error_alerts_schema,
    },
    extra=PREVENT_EXTRA,
)

# Complete email configuration schema
email_schema = Schema(
    {
        Optional("smtp", default={}): smtp_schema,
        Required("from_address", default="noreply@example.com"): str,
        Required("from_name", default="AlpineBits Server"): str,
        Optional("timeout", default=10): Range(min=1, max=300),
        Optional("monitoring", default={}): monitoring_schema,
    },
    extra=PREVENT_EXTRA,
)

# Pushover daily report configuration schema
pushover_daily_report_schema = Schema(
    {
        Required("enabled", default=False): Boolean(),
        Required("send_time", default="08:00"): str,
        Required("include_stats", default=True): Boolean(),
        Required("include_errors", default=True): Boolean(),
        Required("priority", default=0): Range(min=-2, max=2),  # Pushover priority levels
    },
    extra=PREVENT_EXTRA,
)

# Pushover error alerts configuration schema
pushover_error_alerts_schema = Schema(
    {
        Required("enabled", default=False): Boolean(),
        Required("error_threshold", default=5): Range(min=1),
        Required("buffer_minutes", default=15): Range(min=1),
        Required("cooldown_minutes", default=15): Range(min=0),
        Required("log_levels", default=["ERROR", "CRITICAL"]): [
            In(["DEBUG", "INFO", "WARNING", "ERROR", "CRITICAL"])
        ],
        Required("priority", default=1): Range(min=-2, max=2),  # Pushover priority levels
    },
    extra=PREVENT_EXTRA,
)

# Pushover monitoring configuration schema
pushover_monitoring_schema = Schema(
    {
        Optional("daily_report", default={}): pushover_daily_report_schema,
        Optional("error_alerts", default={}): pushover_error_alerts_schema,
    },
    extra=PREVENT_EXTRA,
)

# Complete pushover configuration schema
pushover_schema = Schema(
    {
        Optional("user_key"): str,  # Optional but required for pushover to work
        Optional("api_token"): str,  # Optional but required for pushover to work
        Optional("monitoring", default={}): pushover_monitoring_schema,
    },
    extra=PREVENT_EXTRA,
)

# Unified notification method schema
notification_method_schema = Schema(
    {
        Required("type"): In(["email", "pushover"]),
        Optional("address"): str,  # For email
        Optional("priority"): Range(min=-2, max=2),  # For pushover
    },
    extra=PREVENT_EXTRA,
)

# Unified notification recipient schema
notification_recipient_schema = Schema(
    {
        Required("name"): str,
        Required("methods"): [notification_method_schema],
    },
    extra=PREVENT_EXTRA,
)

# Unified daily report configuration schema (without recipients)
unified_daily_report_schema = Schema(
    {
        Required("enabled", default=False): Boolean(),
        Required("send_time", default="08:00"): str,
        Required("include_stats", default=True): Boolean(),
        Required("include_errors", default=True): Boolean(),
    },
    extra=PREVENT_EXTRA,
)

# Unified error alerts configuration schema (without recipients)
unified_error_alerts_schema = Schema(
    {
        Required("enabled", default=False): Boolean(),
        Required("error_threshold", default=5): Range(min=1),
        Required("buffer_minutes", default=15): Range(min=1),
        Required("cooldown_minutes", default=15): Range(min=0),
        Required("log_levels", default=["ERROR", "CRITICAL"]): [
            In(["DEBUG", "INFO", "WARNING", "ERROR", "CRITICAL"])
        ],
    },
    extra=PREVENT_EXTRA,
)

# Unified notifications configuration schema
notifications_schema = Schema(
    {
        Required("recipients", default=[]): [notification_recipient_schema],
        Optional("daily_report", default={}): unified_daily_report_schema,
        Optional("error_alerts", default={}): unified_error_alerts_schema,
    },
    extra=PREVENT_EXTRA,
)

config_schema = Schema(
    {
        Required(CONF_DATABASE): database_schema,
        Required(CONF_ALPINE_BITS_AUTH): basic_auth_schema,
        Required(CONF_SERVER): server_info,
        Required(CONF_LOGGING): logger_schema,
        Optional("email"): email_schema,  # Email is optional (service config only)
        Optional("pushover"): pushover_schema,  # Pushover is optional (service config only)
        Optional("notifications"): notifications_schema,  # Unified notification config
        Optional("api_tokens", default=[]): [str],  # API tokens for bearer auth
    },
    extra=PREVENT_EXTRA,
)
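
A minimal validation sketch (not part of this changeset), assuming the schemas above; voluptuous fills the optional sections from their declared defaults:

    # Hypothetical input; an unknown key anywhere raises Invalid (PREVENT_EXTRA).
    example = notifications_schema(
        {
            "recipients": [
                {
                    "name": "Ops team",
                    "methods": [
                        {"type": "email", "address": "ops@example.com"},
                        {"type": "pushover", "priority": 1},
                    ],
                }
            ],
        }
    )
    # example["daily_report"]["enabled"] is False, filled in from the schema default.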
@@ -96,6 +276,8 @@ DEFAULT_CONFIG_FILE = "config.yaml"


class Config:
    """Class to load and hold the configuration."""

    def __init__(
        self,
        config_folder: str | Path | None = None,
@@ -1,5 +1,16 @@
from enum import IntEnum
from typing import Final


class HttpStatusCode(IntEnum):
    """Allowed HTTP status codes for AlpineBits responses."""

    OK = 200
    BAD_REQUEST = 400
    UNAUTHORIZED = 401
    INTERNAL_SERVER_ERROR = 500


RESERVATION_ID_TYPE: str = (
    "13"  # Default reservation ID type for a reservation; "14" would be a cancellation
)
@@ -26,6 +37,8 @@ CONF_HOTEL_ID: Final[str] = "hotel_id"
CONF_HOTEL_NAME: Final[str] = "hotel_name"
CONF_USERNAME: Final[str] = "username"
CONF_PASSWORD: Final[str] = "password"
CONF_META_ACCOUNT: Final[str] = "meta_account"
CONF_GOOGLE_ACCOUNT: Final[str] = "google_account"
CONF_PUSH_ENDPOINT: Final[str] = "push_endpoint"
CONF_PUSH_URL: Final[str] = "url"
CONF_PUSH_TOKEN: Final[str] = "token"
1074 src/alpine_bits_python/conversion_service.py (new file; diff suppressed because it is too large)
294 src/alpine_bits_python/customer_service.py (new file)
@@ -0,0 +1,294 @@
"""Customer service layer for handling customer and hashed customer operations."""

from datetime import UTC, datetime

from pydantic import ValidationError
from sqlalchemy import select
from sqlalchemy.ext.asyncio import AsyncSession

from .db import Customer, HashedCustomer
from .logging_config import get_logger
from .schemas import CustomerData

_LOGGER = get_logger(__name__)


class CustomerService:
    """Service for managing customers and their hashed versions.

    Automatically maintains hashed customer data whenever customers are
    created or updated, ensuring data is always in sync for Meta Conversion API.
    """

    def __init__(self, session: AsyncSession):
        self.session = session

    async def create_customer(self, customer_data: dict) -> Customer:
        """Create a new customer and automatically create its hashed version.

        Args:
            customer_data: Dictionary containing customer fields

        Returns:
            The created Customer instance (with hashed_version relationship populated)

        Raises:
            ValidationError: If customer_data fails validation
                (e.g., invalid country code)

        """
        # Validate customer data through Pydantic model
        validated_data = CustomerData(**customer_data)

        # Create the customer with validated data.
        # Exclude 'phone_numbers' as Customer model uses 'phone' field.
        customer = Customer(
            **validated_data.model_dump(exclude_none=True, exclude={"phone_numbers"})
        )

        # Set fields not in CustomerData model separately
        if "contact_id" in customer_data:
            customer.contact_id = customer_data["contact_id"]
        if "phone" in customer_data:
            customer.phone = customer_data["phone"]

        self.session.add(customer)
        await self.session.flush()  # Flush to get the customer.id

        # Create hashed version
        hashed_customer = customer.create_hashed_customer()
        hashed_customer.created_at = datetime.now(UTC)
        self.session.add(hashed_customer)

        await self.session.commit()
        await self.session.refresh(customer)

        return customer

    async def update_customer(self, customer: Customer, update_data: dict) -> Customer:
        """Update an existing customer and sync its hashed version.

        Args:
            customer: The customer to update
            update_data: Dictionary of fields to update

        Returns:
            The updated Customer instance

        Raises:
            ValidationError: If update_data fails validation
                (e.g., invalid country code)

        """
        # Validate update data through Pydantic model.
        # We need to merge with existing data for validation.
        existing_data = {
            "given_name": customer.given_name,
            "surname": customer.surname,
            "name_prefix": customer.name_prefix,
            "email_address": customer.email_address,
            "phone": customer.phone,
            "email_newsletter": customer.email_newsletter,
            "address_line": customer.address_line,
            "city_name": customer.city_name,
            "postal_code": customer.postal_code,
            "country_code": customer.country_code,
            "gender": customer.gender,
            "birth_date": customer.birth_date,
            "language": customer.language,
            "address_catalog": customer.address_catalog,
            "name_title": customer.name_title,
        }
        # Merge update_data into existing_data (only CustomerData fields).
        # Filter to include only fields that exist in the CustomerData model;
        # 'phone' is included as well (it maps to CustomerData).
        customer_data_fields = set(CustomerData.model_fields.keys())
        existing_data.update(
            {
                k: v
                for k, v in update_data.items()
                if k in customer_data_fields or k == "phone"
            }
        )

        # Validate merged data
        validated_data = CustomerData(**existing_data)

        # Update customer fields with validated data.
        # Exclude 'phone_numbers' as Customer model uses 'phone' field.
        # Note: We don't use exclude_none=True, to allow setting fields to None.
        for key, value in validated_data.model_dump(exclude={"phone_numbers"}).items():
            if hasattr(customer, key):
                setattr(customer, key, value)

        # Update fields not in CustomerData model separately
        if "contact_id" in update_data:
            customer.contact_id = update_data["contact_id"]
        if "phone" in update_data:
            customer.phone = update_data["phone"]

        # Update or create hashed version
        result = await self.session.execute(
            select(HashedCustomer).where(HashedCustomer.customer_id == customer.id)
        )
        hashed_customer = result.scalar_one_or_none()

        if hashed_customer:
            # Update existing hashed customer
            new_hashed = customer.create_hashed_customer()
            hashed_customer.hashed_email = new_hashed.hashed_email
            hashed_customer.hashed_phone = new_hashed.hashed_phone
            hashed_customer.hashed_given_name = new_hashed.hashed_given_name
            hashed_customer.hashed_surname = new_hashed.hashed_surname
            hashed_customer.hashed_city = new_hashed.hashed_city
            hashed_customer.hashed_postal_code = new_hashed.hashed_postal_code
            hashed_customer.hashed_country_code = new_hashed.hashed_country_code
            hashed_customer.hashed_gender = new_hashed.hashed_gender
            hashed_customer.hashed_birth_date = new_hashed.hashed_birth_date
        else:
            # Create new hashed customer if it doesn't exist
            hashed_customer = customer.create_hashed_customer()
            hashed_customer.created_at = datetime.now(UTC)
            self.session.add(hashed_customer)

        await self.session.commit()
        await self.session.refresh(customer)

        return customer

    async def get_customer_by_contact_id(self, contact_id: str) -> Customer | None:
        """Get a customer by contact_id.

        Args:
            contact_id: The contact_id to search for

        Returns:
            Customer instance if found, None otherwise

        """
        result = await self.session.execute(
            select(Customer).where(Customer.contact_id == contact_id)
        )
        return result.scalar_one_or_none()

    async def get_or_create_customer(self, customer_data: dict) -> Customer:
        """Get existing customer or create new one if not found.

        Uses contact_id to identify existing customers if provided.

        Args:
            customer_data: Dictionary containing customer fields
                (contact_id is optional)

        Returns:
            Existing or newly created Customer instance

        """
        contact_id = customer_data.get("contact_id")

        if contact_id:
            existing = await self.get_customer_by_contact_id(contact_id)
            if existing:
                # Update existing customer
                return await self.update_customer(existing, customer_data)

        # Create new customer (either no contact_id or customer doesn't exist)
        return await self.create_customer(customer_data)

    async def get_hashed_customer(self, customer_id: int) -> HashedCustomer | None:
        """Get the hashed version of a customer.

        Args:
            customer_id: The customer ID

        Returns:
            HashedCustomer instance if found, None otherwise

        """
        result = await self.session.execute(
            select(HashedCustomer).where(HashedCustomer.customer_id == customer_id)
        )
        return result.scalar_one_or_none()

    async def hash_existing_customers(self) -> int:
        """Hash all existing customers that don't have a hashed version yet.

        This is useful for backfilling hashed data for customers created
        before the hashing system was implemented.

        Also validates and sanitizes customer data (e.g., normalizes country
        codes to uppercase). Customers with invalid data that cannot be fixed
        will be skipped and logged.

        Returns:
            Number of customers that were hashed

        """
        # Get all customers
        result = await self.session.execute(select(Customer))
        customers = result.scalars().all()

        hashed_count = 0
        skipped_count = 0

        for customer in customers:
            # Check if this customer already has a hashed version
            existing_hashed = await self.get_hashed_customer(customer.id)
            if not existing_hashed:
                # Validate and sanitize customer data before hashing
                customer_dict = {
                    "given_name": customer.given_name,
                    "surname": customer.surname,
                    "name_prefix": customer.name_prefix,
                    "email_address": customer.email_address,
                    "phone": customer.phone,
                    "email_newsletter": customer.email_newsletter,
                    "address_line": customer.address_line,
                    "city_name": customer.city_name,
                    "postal_code": customer.postal_code,
                    "country_code": customer.country_code,
                    "gender": customer.gender,
                    "birth_date": customer.birth_date,
                    "language": customer.language,
                    "address_catalog": customer.address_catalog,
                    "name_title": customer.name_title,
                }

                try:
                    # Validate through Pydantic (normalizes country code)
                    validated = CustomerData(**customer_dict)

                    # Update customer with sanitized data.
                    # Exclude 'phone_numbers' as Customer model uses 'phone' field.
                    for key, value in validated.model_dump(
                        exclude_none=True, exclude={"phone_numbers"}
                    ).items():
                        if hasattr(customer, key):
                            setattr(customer, key, value)

                    # Create hashed version with sanitized data
                    hashed_customer = customer.create_hashed_customer()
                    hashed_customer.created_at = datetime.now(UTC)
                    self.session.add(hashed_customer)
                    hashed_count += 1

                except ValidationError as e:
                    # Skip customers with invalid data and log
                    skipped_count += 1
                    _LOGGER.warning(
                        "Skipping customer ID %s due to validation error: %s",
                        customer.id,
                        e,
                    )

        if hashed_count > 0:
            await self.session.commit()

        if skipped_count > 0:
            _LOGGER.warning(
                "Skipped %d customers with invalid data. "
                "Please fix these customers manually.",
                skipped_count,
            )

        return hashed_count
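
A usage sketch (not part of this changeset), assuming an AsyncSession is already open; get_or_create_customer upserts by contact_id and keeps the hashed row in sync:

    async def upsert_example(session: AsyncSession) -> None:
        # Field values are placeholders; CustomerData defines the accepted keys.
        service = CustomerService(session)
        customer = await service.get_or_create_customer(
            {
                "contact_id": "abc-123",
                "given_name": "Ada",
                "surname": "Lovelace",
                "email_address": "ada@example.com",
                "country_code": "IT",
            }
        )
        hashed = await service.get_hashed_customer(customer.id)
        assert hashed is not None  # created together with the customer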
@@ -1,10 +1,27 @@
import asyncio
import hashlib
import os
from typing import Any, AsyncGenerator, Awaitable, Callable, TypeVar

from sqlalchemy import Boolean, Column, Date, DateTime, ForeignKey, Integer, String, JSON
from sqlalchemy.exc import DBAPIError
from sqlalchemy.ext.asyncio import AsyncEngine, AsyncSession, create_async_engine, async_sessionmaker
from sqlalchemy.orm import declarative_base, relationship

from .logging_config import get_logger

_LOGGER = get_logger(__name__)

Base = declarative_base()

# Type variable for async functions
T = TypeVar("T")

# Maximum number of retries for session operations
MAX_RETRIES = 3
# Delay between retries in seconds
RETRY_DELAY = 0.5


# Async SQLAlchemy setup
def get_database_url(config=None):
@@ -18,6 +35,203 @@ def get_database_url(config=None):
    return db_url


def get_database_schema(config=None):
    """Get the PostgreSQL schema name from config.

    Args:
        config: Configuration dictionary

    Returns:
        Schema name string, or None if not configured

    """
    if config and "database" in config and "schema" in config["database"]:
        return config["database"]["schema"]
    return os.environ.get("DATABASE_SCHEMA")


def configure_schema(schema_name=None):
    """Configure the database schema for all models.

    This should be called before creating tables or running migrations.
    For PostgreSQL, this sets the schema for all tables.
    For other databases, this is a no-op.

    Args:
        schema_name: Name of the schema to use (e.g., "alpinebits")

    """
    if schema_name:
        # Update the schema for all tables in Base metadata
        for table in Base.metadata.tables.values():
            table.schema = schema_name


def create_database_engine(config=None, echo=False) -> AsyncEngine:
    """Create a configured database engine with schema support.

    This function:
    1. Gets the database URL from config
    2. Gets the schema name (if configured)
    3. Configures all models to use the schema
    4. Creates the async engine with appropriate connect_args for PostgreSQL

    Args:
        config: Configuration dictionary
        echo: Whether to echo SQL statements (default: False)

    Returns:
        Configured AsyncEngine instance

    """
    database_url = get_database_url(config)
    schema_name = get_database_schema(config)

    # Configure schema for all models if specified
    if schema_name:
        configure_schema(schema_name)
        _LOGGER.info("Configured database schema: %s", schema_name)

    # Create engine with connect_args to set search_path for PostgreSQL
    connect_args = {}
    if schema_name and "postgresql" in database_url:
        connect_args = {
            "server_settings": {"search_path": f"{schema_name},public"}
        }
        _LOGGER.info("Setting PostgreSQL search_path to: %s,public", schema_name)

    return create_async_engine(database_url, echo=echo, connect_args=connect_args)
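
A sketch of driving this from config (an illustration, not part of the diff): only the "schema" key is visible in this hunk, and the URL key is resolved inside get_database_url (body elided above), so the example leans on its environment/default handling rather than guessing a config key:

    engine = create_database_engine(config={"database": {"schema": "alpinebits"}})
    # For a PostgreSQL URL this sets search_path to "alpinebits,public" and
    # points every table in Base.metadata at that schema before DDL or queries.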
class ResilientAsyncSession:
    """Wrapper around AsyncSession that handles connection recovery.

    This wrapper automatically retries operations on connection loss or OID errors,
    disposing the connection pool and creating a fresh session on failure.
    """

    def __init__(
        self,
        async_sessionmaker_: async_sessionmaker[AsyncSession],
        engine: AsyncEngine,
    ):
        """Initialize the resilient session wrapper.

        Args:
            async_sessionmaker_: Factory for creating async sessions
            engine: The SQLAlchemy async engine for connection recovery
        """
        self.async_sessionmaker = async_sessionmaker_
        self.engine = engine

    async def execute_with_retry(
        self, func: Callable[..., Awaitable[T]], *args, **kwargs
    ) -> T:
        """Execute a function with automatic retry on connection errors.

        Args:
            func: Async function that takes a session as its first argument
            *args: Positional arguments to pass to func (after the session)
            **kwargs: Keyword arguments to pass to func

        Returns:
            Result of the function call

        Raises:
            The original exception if all retries are exhausted
        """
        last_error = None

        for attempt in range(MAX_RETRIES):
            try:
                async with self.async_sessionmaker() as session:
                    return await func(session, *args, **kwargs)
            except DBAPIError as e:
                last_error = e
                error_msg = str(e).lower()

                # Check if this is an OID error or connection loss
                if (
                    "could not open relation" in error_msg
                    or "lost connection" in error_msg
                    or "connection closed" in error_msg
                    or "connection refused" in error_msg
                ):
                    _LOGGER.warning(
                        "Connection error on attempt %d/%d: %s. Disposing pool and retrying...",
                        attempt + 1,
                        MAX_RETRIES,
                        e.__class__.__name__,
                    )

                    # Dispose the entire connection pool to force new connections
                    await self.engine.dispose()

                    # Wait before retry (exponential backoff)
                    if attempt < MAX_RETRIES - 1:
                        wait_time = RETRY_DELAY * (2 ** attempt)
                        await asyncio.sleep(wait_time)
                else:
                    # Not a connection-related error, re-raise immediately
                    raise
            except Exception:
                # Any other exception, re-raise immediately
                raise

        # All retries exhausted
        _LOGGER.error(
            "Failed to execute query after %d retries: %s",
            MAX_RETRIES,
            last_error.__class__.__name__,
        )
        raise last_error
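
A usage sketch (assumed wiring, not part of the diff): the wrapped callable receives a fresh session on every attempt, so a retry never reuses a session bound to the dead connection:

    from sqlalchemy import select

    async def load_customer(session: AsyncSession, cid: str) -> "Customer | None":
        result = await session.execute(
            select(Customer).where(Customer.contact_id == cid)
        )
        return result.scalar_one_or_none()

    async def startup(session_factory, engine) -> None:  # assumed app wiring
        resilient = ResilientAsyncSession(session_factory, engine)
        customer = await resilient.execute_with_retry(load_customer, "abc-123")
        _LOGGER.info("Loaded: %s", customer)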
class SessionMaker:
    """Factory for creating independent AsyncSession instances.

    This class enables concurrent processing by allowing each task to create
    and manage its own database session. Useful for processing large datasets
    where concurrent execution is desired but each concurrent task needs its own
    database transaction context.
    """

    def __init__(self, async_sessionmaker_: async_sessionmaker[AsyncSession]):
        """Initialize the SessionMaker.

        Args:
            async_sessionmaker_: SQLAlchemy async_sessionmaker factory
        """
        self.async_sessionmaker = async_sessionmaker_

    async def create_session(self) -> AsyncSession:
        """Create a new independent AsyncSession.

        Returns:
            A new AsyncSession instance ready for use. Caller is responsible
            for managing the session lifecycle (closing when done).
        """
        return self.async_sessionmaker()


async def get_resilient_session(
    resilient_session: "ResilientAsyncSession",
) -> AsyncGenerator[AsyncSession, None]:
    """Dependency for FastAPI that provides a resilient async session.

    This generator creates a new session with automatic retry capability
    on connection errors. Used as a dependency in FastAPI endpoints.

    Args:
        resilient_session: ResilientAsyncSession instance from app state

    Yields:
        AsyncSession instance for database operations
    """
    async with resilient_session.async_sessionmaker() as session:
        yield session


class Customer(Base):
    __tablename__ = "customers"
    id = Column(Integer, primary_key=True)
@@ -39,6 +253,68 @@ class Customer(Base):
    name_title = Column(String)  # Added for XML
    reservations = relationship("Reservation", back_populates="customer")

    @staticmethod
    def _normalize_and_hash(value):
        """Normalize and hash a value according to Meta Conversion API requirements."""
        if not value:
            return None
        # Normalize: lowercase, strip whitespace
        normalized = str(value).lower().strip()
        # Remove separator characters from phone numbers
        is_phone = (
            normalized.startswith("+")
            or normalized.replace("-", "").replace(" ", "").isdigit()
        )
        if is_phone:
            chars_to_remove = [" ", "-", "(", ")"]
            for char in chars_to_remove:
                normalized = normalized.replace(char, "")
        # SHA256 hash
        return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

    def create_hashed_customer(self):
        """Create a HashedCustomer instance from this Customer."""
        return HashedCustomer(
            customer_id=self.id,
            contact_id=self.contact_id,
            hashed_email=self._normalize_and_hash(self.email_address),
            hashed_phone=self._normalize_and_hash(self.phone),
            hashed_given_name=self._normalize_and_hash(self.given_name),
            hashed_surname=self._normalize_and_hash(self.surname),
            hashed_city=self._normalize_and_hash(self.city_name),
            hashed_postal_code=self._normalize_and_hash(self.postal_code),
            hashed_country_code=self._normalize_and_hash(self.country_code),
            hashed_gender=self._normalize_and_hash(self.gender),
            hashed_birth_date=self._normalize_and_hash(self.birth_date),
        )


class HashedCustomer(Base):
    """Hashed customer data for Meta Conversion API.

    Stores SHA256 hashed versions of customer PII according to Meta's requirements.
    This allows sending conversion events without exposing raw customer data.
    """

    __tablename__ = "hashed_customers"
    id = Column(Integer, primary_key=True)
    customer_id = Column(
        Integer, ForeignKey("customers.id"), unique=True, nullable=False
    )
    contact_id = Column(String, unique=True)  # Kept unhashed for reference
    hashed_email = Column(String(64))  # SHA256 produces 64 hex chars
    hashed_phone = Column(String(64))
    hashed_given_name = Column(String(64))
    hashed_surname = Column(String(64))
    hashed_city = Column(String(64))
    hashed_postal_code = Column(String(64))
    hashed_country_code = Column(String(64))
    hashed_gender = Column(String(64))
    hashed_birth_date = Column(String(64))
    created_at = Column(DateTime(timezone=True))

    customer = relationship("Customer", backref="hashed_version")
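
A normalization sketch (illustration only): the first two inputs below hash to the same digest, which is what lets Meta match hashed PII across systems:

    Customer._normalize_and_hash("  Ada@Example.COM ")  # == sha256(b"ada@example.com")
    Customer._normalize_and_hash("ada@example.com")     # same digest
    Customer._normalize_and_hash("+39 0471 123-456")    # separators stripped,
    # i.e. hashlib.sha256(b"+390471123456").hexdigest()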

class Reservation(Base):
    __tablename__ = "reservations"
@@ -52,7 +328,7 @@ class Reservation(Base):
    num_children = Column(Integer)
    children_ages = Column(String)  # comma-separated
    offer = Column(String)
    created_at = Column(DateTime(timezone=True))
    # All UTM fields and the user comment for XML
    utm_source = Column(String)
    utm_medium = Column(String)
@@ -62,9 +338,16 @@ class Reservation(Base):
    user_comment = Column(String)
    fbclid = Column(String)
    gclid = Column(String)
    # Advertising account IDs (stored conditionally based on fbclid/gclid presence)
    meta_account_id = Column(String)
    google_account_id = Column(String)
    # hotel_code and hotel_name for XML
    hotel_code = Column(String)
    hotel_name = Column(String)
    # RoomTypes fields (optional)
    room_type_code = Column(String)
    room_classification_code = Column(String)
    room_type = Column(String)
    customer = relationship("Customer", back_populates="reservations")


@@ -73,7 +356,123 @@ class AckedRequest(Base):
    __tablename__ = "acked_requests"
    id = Column(Integer, primary_key=True)
    client_id = Column(String, index=True)
    username = Column(String, index=True, nullable=True)  # Username of the client making the request
    unique_id = Column(
        String, index=True
    )  # Should match Reservation.form_id or another unique field
    timestamp = Column(DateTime(timezone=True))


class Conversion(Base):
    """Conversion data from hotel PMS.

    Represents a single reservation event from the PMS XML with all its metadata.
    Each row links to one reservation from the PMS system. A reservation can have
    multiple room reservations (stored in RoomReservation table).

    Linked to reservations via advertising tracking data (fbclid, gclid, etc.)
    stored in the advertisingCampagne field.
    """

    __tablename__ = "conversions"
    id = Column(Integer, primary_key=True)

    # Link to reservation (nullable since matching may not always work)
    reservation_id = Column(
        Integer, ForeignKey("reservations.id"), nullable=True, index=True
    )
    customer_id = Column(Integer, ForeignKey("customers.id"), nullable=True, index=True)
    hashed_customer_id = Column(
        Integer, ForeignKey("hashed_customers.id"), nullable=True, index=True
    )

    # Reservation metadata from XML
    hotel_id = Column(String, index=True)  # hotelID attribute
    pms_reservation_id = Column(String, index=True)  # id attribute from reservation
    reservation_number = Column(String)  # number attribute
    reservation_date = Column(Date)  # date attribute (when reservation was made)
    creation_time = Column(DateTime(timezone=True))  # creationTime attribute
    reservation_type = Column(String)  # type attribute (e.g., "reservation")
    booking_channel = Column(String)  # bookingChannel attribute

    # Guest information from reservation XML - used for matching
    guest_first_name = Column(String, index=True)  # firstName from guest element
    guest_last_name = Column(String, index=True)  # lastName from guest element
    guest_email = Column(String, index=True)  # email from guest element
    guest_country_code = Column(String)  # countryCode from guest element

    # Advertising/tracking data - used for matching to existing reservations
    advertising_medium = Column(
        String, index=True
    )  # advertisingMedium (e.g., "99TALES")
    advertising_partner = Column(
        String, index=True
    )  # advertisingPartner (e.g., "cpc", "website")
    advertising_campagne = Column(
        String, index=True
    )  # advertisingCampagne (contains fbclid/gclid)

    # Metadata
    created_at = Column(DateTime(timezone=True))  # When this record was imported
    updated_at = Column(DateTime(timezone=True))  # When this record was last updated

    # Relationships
    reservation = relationship("Reservation", backref="conversions")
    customer = relationship("Customer", backref="conversions")
    hashed_customer = relationship("HashedCustomer", backref="conversions")
    room_reservations = relationship(
        "RoomReservation", back_populates="conversion", cascade="all, delete-orphan"
    )


class RoomReservation(Base):
    """Room reservation data from hotel PMS.

    Represents a single room reservation within a conversion/PMS reservation.
    One conversion can have multiple room reservations (e.g., customer books 3 rooms).

    Daily sales are stored as a JSON blob with an extracted total_revenue field
    for efficient querying.
    """

    __tablename__ = "room_reservations"
    id = Column(Integer, primary_key=True)

    # Link to the parent conversion/PMS reservation
    conversion_id = Column(
        Integer, ForeignKey("conversions.id"), nullable=False, index=True
    )

    # Identifier for this room reservation (for upserts).
    # Composite: pms_reservation_id + room_number.
    # Note: Not globally unique - the same room number can exist across different hotels.
    pms_hotel_reservation_id = Column(String, index=True)

    # Room reservation details
    arrival_date = Column(Date, index=True)  # arrival attribute
    departure_date = Column(Date, index=True)  # departure attribute
    room_status = Column(String)  # status attribute (e.g., "reserved", "departed")
    room_type = Column(String)  # roomType attribute (e.g., "VDS", "EZR")
    room_number = Column(String, index=True)  # roomNumber attribute
    num_adults = Column(Integer)  # adults attribute
    rate_plan_code = Column(String)  # ratePlanCode attribute
    connected_room_type = Column(String)  # connectedRoomType attribute

    # Daily sales data stored as JSON.
    # Format: [
    #   {"date": "2021-10-09", "revenueTotal": "13.6", "revenueOther": "13.6"},
    #   {"date": "2021-10-10", "revenueTotal": "306.1", "revenueLogis": "254", ...},
    #   ...
    # ]
    daily_sales = Column(JSON, nullable=True)  # JSON array of daily sales

    # Extracted total revenue for efficient querying (sum of all revenueTotal
    # values in daily_sales). Kept as a string to preserve decimal precision.
    total_revenue = Column(String, nullable=True)

    # Metadata
    created_at = Column(DateTime(timezone=True))  # When this record was imported
    updated_at = Column(DateTime(timezone=True))  # When this record was last updated

    # Relationships
    conversion = relationship("Conversion", back_populates="room_reservations")
571 src/alpine_bits_python/email_monitoring.py (new file)
@@ -0,0 +1,571 @@
"""Email monitoring and alerting through logging integration.

This module provides a custom logging handler that accumulates errors and sends
email alerts based on configurable thresholds and time windows.
"""

import asyncio
import logging
import threading
from collections import defaultdict, deque
from datetime import datetime, timedelta
from typing import Any

from sqlalchemy import func, select
from sqlalchemy.ext.asyncio import async_sessionmaker

from .db import Reservation
from .email_service import EmailService
from .logging_config import get_logger

_LOGGER = get_logger(__name__)


class ErrorRecord:
    """Represents a single error log record for monitoring.

    Attributes:
        timestamp: When the error occurred
        level: Log level (ERROR, CRITICAL, etc.)
        logger_name: Name of the logger that generated the error
        message: The error message
        exception: Exception info if available
        module: Module where error occurred
        line_no: Line number where error occurred

    """

    def __init__(self, record: logging.LogRecord):
        """Initialize from a logging.LogRecord.

        Args:
            record: The logging record to wrap

        """
        self.timestamp = datetime.fromtimestamp(record.created)
        self.level = record.levelname
        self.logger_name = record.name
        self.message = record.getMessage()
        self.exception = record.exc_text if record.exc_info else None
        self.module = record.module
        self.line_no = record.lineno
        self.pathname = record.pathname

    def to_dict(self) -> dict[str, Any]:
        """Convert to dictionary format.

        Returns:
            Dictionary representation of the error

        """
        return {
            "timestamp": self.timestamp.strftime("%Y-%m-%d %H:%M:%S"),
            "level": self.level,
            "logger_name": self.logger_name,
            "message": self.message,
            "exception": self.exception,
            "module": self.module,
            "line_no": self.line_no,
            "pathname": self.pathname,
        }

    def format_plain_text(self) -> str:
        """Format error as plain text for email.

        Returns:
            Formatted plain text string

        """
        text = f"[{self.timestamp.strftime('%Y-%m-%d %H:%M:%S')}] {self.level}: {self.message}\n"
        text += f"  Module: {self.module}:{self.line_no} ({self.logger_name})\n"
        if self.exception:
            text += f"  Exception:\n{self.exception}\n"
        return text


class EmailAlertHandler(logging.Handler):
    """Custom logging handler that sends email alerts for errors.

    This handler uses a hybrid approach:
    - Accumulates errors in a buffer
    - Sends immediately if error threshold is reached
    - Otherwise sends after buffer duration expires
    - Always sends buffered errors (no minimum threshold for time-based flush)
    - Implements cooldown to prevent alert spam

    The handler is thread-safe and works with asyncio event loops.
    """

    def __init__(
        self,
        email_service: EmailService,
        config: dict[str, Any],
        loop: asyncio.AbstractEventLoop | None = None,
    ):
        """Initialize the email alert handler.

        Args:
            email_service: Email service instance for sending alerts
            config: Configuration dictionary for error alerts
            loop: Asyncio event loop (will use current loop if not provided)

        """
        super().__init__()
        self.email_service = email_service
        self.config = config
        self.loop = loop  # Will be set when first error occurs if not provided

        # Configuration
        self.recipients = config.get("recipients", [])
        self.error_threshold = config.get("error_threshold", 5)
        self.buffer_minutes = config.get("buffer_minutes", 15)
        self.cooldown_minutes = config.get("cooldown_minutes", 15)
        self.log_levels = config.get("log_levels", ["ERROR", "CRITICAL"])

        # State
        self.error_buffer: deque[ErrorRecord] = deque()
        self.last_sent = datetime.min  # Last time we sent an alert
        self._flush_task: asyncio.Task | None = None
        self._lock = threading.Lock()  # Thread-safe for multi-threaded logging

        _LOGGER.info(
            "EmailAlertHandler initialized: threshold=%d, buffer=%dmin, cooldown=%dmin",
            self.error_threshold,
            self.buffer_minutes,
            self.cooldown_minutes,
        )

    def emit(self, record: logging.LogRecord) -> None:
        """Handle a log record.

        This is called automatically by the logging system when an error is logged.
        It's important that this method is fast and doesn't block.

        Args:
            record: The log record to handle

        """
        # Only handle configured log levels
        if record.levelname not in self.log_levels:
            return

        try:
            # Ensure we have an event loop
            if self.loop is None:
                try:
                    self.loop = asyncio.get_running_loop()
                except RuntimeError:
                    # No running loop, we'll need to handle this differently
                    _LOGGER.warning("No asyncio event loop available for email alerts")
                    return

            # Add error to buffer (thread-safe)
            with self._lock:
                error_record = ErrorRecord(record)
                self.error_buffer.append(error_record)
                buffer_size = len(self.error_buffer)

            # Determine if we should send immediately
            should_send_immediately = buffer_size >= self.error_threshold

            if should_send_immediately:
                # Cancel any pending flush task
                if self._flush_task and not self._flush_task.done():
                    self._flush_task.cancel()

                # Schedule immediate flush
                self._flush_task = asyncio.run_coroutine_threadsafe(
                    self._flush_buffer(immediate=True),
                    self.loop,
                )
            # Schedule delayed flush if not already scheduled
            elif not self._flush_task or self._flush_task.done():
                self._flush_task = asyncio.run_coroutine_threadsafe(
                    self._schedule_delayed_flush(),
                    self.loop,
                )

        except Exception:
            # Never let the handler crash - just log and continue
            _LOGGER.exception("Error in EmailAlertHandler.emit")

    async def _schedule_delayed_flush(self) -> None:
        """Schedule a delayed buffer flush after buffer duration."""
        await asyncio.sleep(self.buffer_minutes * 60)
        await self._flush_buffer(immediate=False)

    async def _flush_buffer(self, *, immediate: bool) -> None:
        """Flush the error buffer and send email alert.

        Args:
            immediate: Whether this is an immediate flush (threshold hit)

        """
        # Check cooldown period
        now = datetime.now()
        time_since_last = (now - self.last_sent).total_seconds() / 60

        if time_since_last < self.cooldown_minutes:
            _LOGGER.info(
                "Alert cooldown active (%.1f min remaining), buffering errors",
                self.cooldown_minutes - time_since_last,
            )
            # Don't clear buffer - let errors accumulate until cooldown expires
            return

        # Get all buffered errors (thread-safe)
        with self._lock:
            if not self.error_buffer:
                return

            errors = list(self.error_buffer)
            self.error_buffer.clear()

        # Update last sent time
        self.last_sent = now

        # Format email
        error_count = len(errors)
        time_range = (
            f"{errors[0].timestamp.strftime('%H:%M:%S')} to "
            f"{errors[-1].timestamp.strftime('%H:%M:%S')}"
        )

        # Determine alert type for subject
        alert_type = "Immediate Alert" if immediate else "Scheduled Alert"
        if immediate:
            emoji = "🚨"
            reason = f"(threshold of {self.error_threshold} exceeded)"
        else:
            emoji = "⚠️"
            reason = f"({self.buffer_minutes} minute buffer)"

        subject = (
            f"{emoji} AlpineBits Error {alert_type}: {error_count} errors {reason}"
        )

        # Build plain text body
        body = f"Error Alert - {datetime.now().strftime('%Y-%m-%d %H:%M:%S')}\n"
        body += "=" * 70 + "\n\n"
        body += f"Alert Type: {alert_type}\n"
        body += f"Error Count: {error_count}\n"
        body += f"Time Range: {time_range}\n"
        body += f"Reason: {reason}\n"
        body += "\n" + "=" * 70 + "\n\n"

        # Add individual errors
        body += "Errors:\n"
        body += "-" * 70 + "\n\n"
        for error in errors:
            body += error.format_plain_text()
            body += "\n"

        body += "-" * 70 + "\n"
        body += f"Generated by AlpineBits Email Monitoring at {now.strftime('%Y-%m-%d %H:%M:%S')}\n"

        # Send email
        try:
            success = await self.email_service.send_alert(
                recipients=self.recipients,
                subject=subject,
                body=body,
            )

            if success:
                _LOGGER.info(
                    "Email alert sent successfully: %d errors to %s",
                    error_count,
                    self.recipients,
                )
            else:
                _LOGGER.error("Failed to send email alert for %d errors", error_count)

        except Exception:
            _LOGGER.exception("Exception while sending email alert")

    def close(self) -> None:
        """Close the handler and flush any remaining errors.

        This is called when the logging system shuts down.
        """
        # Cancel any pending flush tasks
        if self._flush_task and not self._flush_task.done():
            self._flush_task.cancel()

        # Flush any remaining errors immediately
        if self.error_buffer and self.loop:
            try:
                # Check if the loop is still running
                if not self.loop.is_closed():
                    future = asyncio.run_coroutine_threadsafe(
                        self._flush_buffer(immediate=False),
                        self.loop,
                    )
                    future.result(timeout=5)
                else:
                    _LOGGER.warning(
                        "Event loop closed, cannot flush %d remaining errors",
                        len(self.error_buffer),
                    )
            except Exception:
                _LOGGER.exception("Error flushing buffer on close")

        super().close()
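
A wiring sketch (assumed application startup code, not part of the diff), run inside the event loop so emit() can hand flushes to run_coroutine_threadsafe:

    async def install_alert_handler(app_config: dict) -> None:  # app_config assumed
        service = EmailService(EmailConfig(app_config["email"]))
        handler = EmailAlertHandler(
            service,
            {"recipients": ["ops@example.com"], "error_threshold": 5},
            loop=asyncio.get_running_loop(),
        )
        handler.setLevel(logging.ERROR)
        logging.getLogger().addHandler(handler)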

class DailyReportScheduler:
    """Scheduler for sending daily reports at configured times.

    This runs as a background task and sends daily reports containing
    statistics and error summaries.
    """

    def __init__(
        self,
        email_service: EmailService,
        config: dict[str, Any],
    ):
        """Initialize the daily report scheduler.

        Args:
            email_service: Email service for sending reports
            config: Configuration for daily reports

        """
        self.email_service = email_service
        self.config = config
        self.recipients = config.get("recipients", [])
        self.send_time = config.get("send_time", "08:00")  # Default 8 AM
        self.include_stats = config.get("include_stats", True)
        self.include_errors = config.get("include_errors", True)

        self._task: asyncio.Task | None = None
        self._stats_collector = None  # Will be set by application
        self._error_log: list[dict[str, Any]] = []

        _LOGGER.info(
            "DailyReportScheduler initialized: send_time=%s, recipients=%s",
            self.send_time,
            self.recipients,
        )

    def start(self) -> None:
        """Start the daily report scheduler."""
        if self._task is None or self._task.done():
            self._task = asyncio.create_task(self._run())
            _LOGGER.info("Daily report scheduler started")

    def stop(self) -> None:
        """Stop the daily report scheduler."""
        if self._task and not self._task.done():
            self._task.cancel()
            _LOGGER.info("Daily report scheduler stopped")

    def log_error(self, error: dict[str, Any]) -> None:
        """Log an error for inclusion in daily report.

        Args:
            error: Error information dictionary

        """
        self._error_log.append(error)

    async def _run(self) -> None:
        """Run the daily report scheduler loop."""
        while True:
            try:
                # Calculate time until next report
                now = datetime.now()
                target_hour, target_minute = map(int, self.send_time.split(":"))

                # Calculate next send time
                next_send = now.replace(
                    hour=target_hour,
                    minute=target_minute,
                    second=0,
                    microsecond=0,
                )

                # If time has passed today, schedule for tomorrow
                if next_send <= now:
                    next_send += timedelta(days=1)

                # Calculate sleep duration
                sleep_seconds = (next_send - now).total_seconds()

                _LOGGER.info(
                    "Next daily report scheduled for %s (in %.1f hours)",
                    next_send.strftime("%Y-%m-%d %H:%M:%S"),
                    sleep_seconds / 3600,
                )

                # Wait until send time
                await asyncio.sleep(sleep_seconds)

                # Send report
                await self._send_report()

            except asyncio.CancelledError:
                _LOGGER.info("Daily report scheduler cancelled")
                break
            except Exception:
                _LOGGER.exception("Error in daily report scheduler")
                # Sleep a bit before retrying
                await asyncio.sleep(60)

    async def _send_report(self) -> None:
        """Send the daily report."""
        stats = {}

        # Collect statistics if enabled
        if self.include_stats and self._stats_collector:
            try:
                stats = await self._stats_collector()
            except Exception:
                _LOGGER.exception("Error collecting statistics for daily report")

        # Get errors if enabled
        errors = self._error_log.copy() if self.include_errors else None

        # Send report
        try:
            success = await self.email_service.send_daily_report(
                recipients=self.recipients,
                stats=stats,
                errors=errors,
            )

            if success:
                _LOGGER.info("Daily report sent successfully to %s", self.recipients)
                # Clear error log after successful send
                self._error_log.clear()
            else:
                _LOGGER.error("Failed to send daily report")

        except Exception:
            _LOGGER.exception("Exception while sending daily report")

    def set_stats_collector(self, collector) -> None:
        """Set the statistics collector function.

        Args:
            collector: Async function that returns statistics dictionary

        """
        self._stats_collector = collector
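
A startup sketch (assumed wiring, not part of the diff); the collector can be any zero-argument coroutine returning a stats dict, such as the ReservationStatsCollector defined below:

    async def start_daily_reports(email_service, session_factory, app_config):
        scheduler = DailyReportScheduler(
            email_service,
            {"recipients": ["ops@example.com"], "send_time": "08:00"},
        )
        collector = ReservationStatsCollector(session_factory, app_config)
        scheduler.set_stats_collector(collector.collect_stats)
        scheduler.start()  # must run inside an active event loop
        return scheduler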

class ReservationStatsCollector:
    """Collects reservation statistics per hotel for daily reports.

    This collector queries the database for reservations created since the last
    report and aggregates them by hotel. It includes hotel_code and hotel_name
    from the configuration.
    """

    def __init__(
        self,
        async_sessionmaker: async_sessionmaker,
        config: dict[str, Any],
    ):
        """Initialize the stats collector.

        Args:
            async_sessionmaker: SQLAlchemy async session maker
            config: Application configuration containing hotel information

        """
        self.async_sessionmaker = async_sessionmaker
        self.config = config
        self._last_report_time = datetime.now()

        # Build hotel mapping from config
        self._hotel_map = {}
        for hotel in config.get("alpine_bits_auth", []):
            hotel_id = hotel.get("hotel_id")
            hotel_name = hotel.get("hotel_name")
            if hotel_id:
                self._hotel_map[hotel_id] = hotel_name or "Unknown Hotel"

        _LOGGER.info(
            "ReservationStatsCollector initialized with %d hotels",
            len(self._hotel_map),
        )

    async def collect_stats(self, lookback_hours: int | None = None) -> dict[str, Any]:
        """Collect reservation statistics for the reporting period.

        Args:
            lookback_hours: Optional override to look back N hours from now.
                If None, uses time since last report.

        Returns:
            Dictionary with statistics including reservations per hotel

        """
        now = datetime.now()

        if lookback_hours is not None:
            # Override mode: look back N hours from now
            period_start = now - timedelta(hours=lookback_hours)
            period_end = now
        else:
            # Normal mode: since last report
            period_start = self._last_report_time
            period_end = now

        _LOGGER.info(
            "Collecting reservation stats from %s to %s",
            period_start.strftime("%Y-%m-%d %H:%M:%S"),
            period_end.strftime("%Y-%m-%d %H:%M:%S"),
        )

        async with self.async_sessionmaker() as session:
            # Query reservations created in the reporting period
            result = await session.execute(
                select(Reservation.hotel_code, func.count(Reservation.id))
                .where(Reservation.created_at >= period_start)
                .where(Reservation.created_at < period_end)
                .group_by(Reservation.hotel_code)
            )

            hotel_counts = dict(result.all())

        # Build stats with hotel names from config
        hotels_stats = []
        total_reservations = 0

        for hotel_code, count in hotel_counts.items():
            hotel_name = self._hotel_map.get(hotel_code, "Unknown Hotel")
            hotels_stats.append(
                {
                    "hotel_code": hotel_code,
                    "hotel_name": hotel_name,
                    "reservations": count,
                }
            )
            total_reservations += count

        # Sort by reservation count descending
        hotels_stats.sort(key=lambda x: x["reservations"], reverse=True)

        # Update last report time only in normal mode (not lookback mode)
        if lookback_hours is None:
            self._last_report_time = now

        stats = {
            "reporting_period": {
                "start": period_start.strftime("%Y-%m-%d %H:%M:%S"),
                "end": period_end.strftime("%Y-%m-%d %H:%M:%S"),
            },
            "total_reservations": total_reservations,
            "hotels": hotels_stats,
        }

        _LOGGER.info(
            "Collected stats: %d total reservations across %d hotels",
            total_reservations,
            len(hotels_stats),
        )

        return stats
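
A lookback sketch (illustration only): an ad-hoc 24-hour window can be requested without moving the collector's "since last report" bookmark:

    async def ad_hoc_report(collector: ReservationStatsCollector) -> None:
        stats = await collector.collect_stats(lookback_hours=24)
        for hotel in stats["hotels"]:
            print(hotel["hotel_code"], hotel["reservations"])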
373 src/alpine_bits_python/email_service.py (new file)
@@ -0,0 +1,373 @@
"""Email service for sending alerts and reports.

This module provides email functionality for the AlpineBits application,
including error alerts and daily reports.
"""

import asyncio
import smtplib
import ssl
from concurrent.futures import ThreadPoolExecutor
from datetime import datetime
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText
from typing import Any

from pydantic import EmailStr, Field, field_validator

from .logging_config import get_logger

_LOGGER = get_logger(__name__)


class EmailConfig:
    """Configuration for email service.

    Attributes:
        smtp_host: SMTP server hostname
        smtp_port: SMTP server port
        smtp_username: SMTP authentication username
        smtp_password: SMTP authentication password
        use_tls: Use STARTTLS for encryption
        use_ssl: Use SSL/TLS from the start
        from_address: Sender email address
        from_name: Sender display name
        timeout: Connection timeout in seconds

    """

    def __init__(self, config: dict[str, Any]):
        """Initialize email configuration from config dict.

        Args:
            config: Email configuration dictionary

        """
        smtp_config = config.get("smtp", {})
        self.smtp_host: str = smtp_config.get("host", "localhost")
        self.smtp_port: int = smtp_config.get("port", 587)
        self.smtp_username: str | None = smtp_config.get("username")
        self.smtp_password: str | None = smtp_config.get("password")
        self.use_tls: bool = smtp_config.get("use_tls", True)
        self.use_ssl: bool = smtp_config.get("use_ssl", False)
        self.from_address: str = config.get("from_address", "noreply@example.com")
        self.from_name: str = config.get("from_name", "AlpineBits Server")
        self.timeout: int = config.get("timeout", 10)

        # Validate configuration
        if self.use_tls and self.use_ssl:
            msg = "Cannot use both TLS and SSL"
            raise ValueError(msg)
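
A minimal sketch of the dict EmailConfig expects; the keys mirror the email_schema defaults in config.py, and the host and credentials are placeholders:

    email_config = EmailConfig(
        {
            "smtp": {
                "host": "smtp.example.com",
                "port": 587,
                "username": "mailer",
                "password": "placeholder",
                "use_tls": True,
            },
            "from_address": "noreply@example.com",
            "from_name": "AlpineBits Server",
            "timeout": 10,
        }
    )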
|
||||
|
||||
|
||||
class EmailService:
|
||||
"""Service for sending emails via SMTP.
|
||||
|
||||
This service handles sending both plain text and HTML emails,
|
||||
with support for TLS/SSL encryption and authentication.
|
||||
"""
|
||||
|
||||
def __init__(self, config: EmailConfig):
|
||||
"""Initialize email service.
|
||||
|
||||
Args:
|
||||
config: Email configuration
|
||||
|
||||
"""
|
||||
self.config = config
|
||||
# Create dedicated thread pool for SMTP operations (max 2 threads is enough for email)
|
||||
# This prevents issues with default executor in multi-process environments
|
||||
self._executor = ThreadPoolExecutor(max_workers=2, thread_name_prefix="smtp-")
|
||||
|
||||
async def send_email(
|
||||
self,
|
||||
recipients: list[str],
|
||||
subject: str,
|
||||
body: str,
|
||||
html_body: str | None = None,
|
||||
) -> bool:
|
||||
"""Send an email to recipients.
|
||||
|
||||
Args:
|
||||
recipients: List of recipient email addresses
|
||||
subject: Email subject line
|
||||
body: Plain text email body
|
||||
html_body: Optional HTML email body
|
||||
|
||||
Returns:
|
||||
True if email was sent successfully, False otherwise
|
||||
|
||||
"""
|
||||
if not recipients:
|
||||
_LOGGER.warning("No recipients specified for email: %s", subject)
|
||||
return False
|
||||
|
||||
try:
|
||||
# Build message
|
||||
msg = MIMEMultipart("alternative")
|
||||
msg["Subject"] = subject
|
||||
msg["From"] = f"{self.config.from_name} <{self.config.from_address}>"
|
||||
msg["To"] = ", ".join(recipients)
|
||||
msg["Date"] = datetime.now().strftime("%a, %d %b %Y %H:%M:%S %z")
|
||||
|
||||
# Attach plain text body
|
||||
msg.attach(MIMEText(body, "plain"))
|
||||
|
||||
# Attach HTML body if provided
|
||||
if html_body:
|
||||
msg.attach(MIMEText(html_body, "html"))
|
||||
|
||||
# Send email in dedicated thread pool (SMTP is blocking)
|
||||
loop = asyncio.get_event_loop()
|
||||
await loop.run_in_executor(self._executor, self._send_smtp, msg, recipients)
|
||||
|
||||
_LOGGER.info("Email sent successfully to %s: %s", recipients, subject)
|
||||
return True
|
||||
|
||||
except Exception:
|
||||
_LOGGER.exception("Failed to send email to %s: %s", recipients, subject)
|
||||
return False
|
||||
|
||||
def _send_smtp(self, msg: MIMEMultipart, recipients: list[str]) -> None:
|
||||
"""Send email via SMTP (blocking operation).
|
||||
|
||||
Args:
|
||||
msg: Email message to send
|
||||
recipients: List of recipient addresses
|
||||
|
||||
Raises:
|
||||
Exception: If email sending fails
|
||||
|
||||
"""
|
||||
if self.config.use_ssl:
|
||||
# Connect with SSL from the start
|
||||
context = ssl.create_default_context()
|
||||
with smtplib.SMTP_SSL(
|
||||
self.config.smtp_host,
|
||||
self.config.smtp_port,
|
||||
timeout=self.config.timeout,
|
||||
context=context,
|
||||
) as server:
|
||||
if self.config.smtp_username and self.config.smtp_password:
|
||||
server.login(self.config.smtp_username, self.config.smtp_password)
|
||||
server.send_message(msg, self.config.from_address, recipients)
|
||||
else:
|
||||
# Connect and optionally upgrade to TLS
|
||||
with smtplib.SMTP(
|
||||
self.config.smtp_host,
|
||||
self.config.smtp_port,
|
||||
timeout=self.config.timeout,
|
||||
) as server:
|
||||
if self.config.use_tls:
|
||||
context = ssl.create_default_context()
|
||||
server.starttls(context=context)
|
||||
|
||||
if self.config.smtp_username and self.config.smtp_password:
|
||||
server.login(self.config.smtp_username, self.config.smtp_password)
|
||||
|
||||
server.send_message(msg, self.config.from_address, recipients)
|
||||
|
||||
async def send_alert(
|
||||
self,
|
||||
recipients: list[str],
|
||||
subject: str,
|
||||
body: str,
|
||||
) -> bool:
|
||||
"""Send an alert email (convenience method).
|
||||
|
||||
Args:
|
||||
recipients: List of recipient email addresses
|
||||
subject: Email subject line
|
||||
body: Email body text
|
||||
|
||||
Returns:
|
||||
True if email was sent successfully, False otherwise
|
||||
|
||||
"""
|
||||
return await self.send_email(recipients, subject, body)
|
||||
|
||||
    async def send_daily_report(
        self,
        recipients: list[str],
        stats: dict[str, Any],
        errors: list[dict[str, Any]] | None = None,
    ) -> bool:
        """Send a daily report email.

        Args:
            recipients: List of recipient email addresses
            stats: Dictionary containing statistics to include in report
            errors: Optional list of errors to include

        Returns:
            True if email was sent successfully, False otherwise

        """
        date_str = datetime.now().strftime("%Y-%m-%d")
        subject = f"AlpineBits Daily Report - {date_str}"

        # Build plain text body
        body = f"AlpineBits Daily Report for {date_str}\n"
        body += "=" * 60 + "\n\n"

        # Add statistics
        if stats:
            body += "Statistics:\n"
            body += "-" * 60 + "\n"
            for key, value in stats.items():
                body += f"  {key}: {value}\n"
            body += "\n"

        # Add errors if present
        if errors:
            body += f"Errors ({len(errors)}):\n"
            body += "-" * 60 + "\n"
            for error in errors[:20]:  # Limit to 20 most recent errors
                timestamp = error.get("timestamp", "Unknown")
                level = error.get("level", "ERROR")
                message = error.get("message", "No message")
                body += f"  [{timestamp}] {level}: {message}\n"
            if len(errors) > 20:
                body += f"  ... and {len(errors) - 20} more errors\n"
            body += "\n"

        body += "-" * 60 + "\n"
        body += "Generated by AlpineBits Server\n"

        # Build HTML body for better formatting
        html_body = self._build_daily_report_html(date_str, stats, errors)

        return await self.send_email(recipients, subject, body, html_body)

    def _build_daily_report_html(
        self,
        date_str: str,
        stats: dict[str, Any],
        errors: list[dict[str, Any]] | None,
    ) -> str:
        """Build HTML version of daily report.

        Args:
            date_str: Date string for the report
            stats: Statistics dictionary
            errors: Optional list of errors

        Returns:
            HTML string for the email body

        """
        html = f"""
        <html>
        <head>
        <style>
        body {{ font-family: Arial, sans-serif; }}
        h1 {{ color: #333; }}
        h2 {{ color: #666; margin-top: 20px; }}
        table {{ border-collapse: collapse; width: 100%; }}
        th, td {{ text-align: left; padding: 8px; border-bottom: 1px solid #ddd; }}
        th {{ background-color: #f2f2f2; }}
        .error {{ color: #d32f2f; }}
        .warning {{ color: #f57c00; }}
        .footer {{ margin-top: 30px; color: #999; font-size: 12px; }}
        </style>
        </head>
        <body>
        <h1>AlpineBits Daily Report</h1>
        <p><strong>Date:</strong> {date_str}</p>
        """

        # Add statistics table
        if stats:
            html += """
            <h2>Statistics</h2>
            <table>
            <tr>
                <th>Metric</th>
                <th>Value</th>
            </tr>
            """
            for key, value in stats.items():
                html += f"""
                <tr>
                    <td>{key}</td>
                    <td>{value}</td>
                </tr>
                """
            html += "</table>"

        # Add errors table
        if errors:
            html += f"""
            <h2>Errors ({len(errors)})</h2>
            <table>
            <tr>
                <th>Time</th>
                <th>Level</th>
                <th>Message</th>
            </tr>
            """
            for error in errors[:20]:  # Limit to 20 most recent
                timestamp = error.get("timestamp", "Unknown")
                level = error.get("level", "ERROR")
                message = error.get("message", "No message")
                css_class = "error" if level in ("ERROR", "CRITICAL") else "warning"
                html += f"""
                <tr>
                    <td>{timestamp}</td>
                    <td class="{css_class}">{level}</td>
                    <td>{message}</td>
                </tr>
                """
            if len(errors) > 20:
                html += f"""
                <tr>
                    <td colspan="3"><em>... and {len(errors) - 20} more errors</em></td>
                </tr>
                """
            html += "</table>"

        html += """
        <div class="footer">
            <p>Generated by AlpineBits Server</p>
        </div>
        </body>
        </html>
        """

        return html

    def shutdown(self) -> None:
        """Shutdown the email service and clean up thread pool.

        This should be called during application shutdown to ensure
        proper cleanup of the thread pool executor.
        """
        if self._executor:
            _LOGGER.info("Shutting down email service thread pool")
            self._executor.shutdown(wait=True, cancel_futures=False)
            _LOGGER.info("Email service thread pool shutdown complete")


def create_email_service(config: dict[str, Any]) -> EmailService | None:
    """Create an email service from configuration.

    Args:
        config: Full application configuration dictionary

    Returns:
        EmailService instance if email is configured, None otherwise

    """
    email_config = config.get("email")
    if not email_config:
        _LOGGER.info("Email not configured, email service disabled")
        return None

    try:
        email_cfg = EmailConfig(email_config)
        service = EmailService(email_cfg)
        _LOGGER.info("Email service initialized: %s:%s", email_cfg.smtp_host, email_cfg.smtp_port)
        return service
    except Exception:
        _LOGGER.exception("Failed to initialize email service")
        return None
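
# A minimal usage sketch (assumes EmailConfig accepts these keys; the host and
# addresses are placeholders, not values from this repository):
import asyncio

async def _email_example() -> None:
    config = {
        "email": {
            "smtp_host": "smtp.example.com",  # placeholder
            "smtp_port": 587,
            "use_tls": True,
            "from_name": "AlpineBits",
            "from_address": "noreply@example.com",
        }
    }
    service = create_email_service(config)
    if service:
        await service.send_alert(["ops@example.com"], "Test alert", "Hello")
        service.shutdown()

asyncio.run(_email_example())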
@@ -4,16 +4,41 @@ This module sets up logging based on config and provides a function to get
loggers from anywhere in the application.
"""

import asyncio
import logging
import sys
from pathlib import Path
from typing import TYPE_CHECKING

if TYPE_CHECKING:
    from alpine_bits_python.email_monitoring import (
        DailyReportScheduler,
        EmailAlertHandler,
    )
    from alpine_bits_python.email_service import EmailService
    from alpine_bits_python.pushover_service import PushoverService


def setup_logging(config: dict | None = None):
def setup_logging(
    config: dict | None = None,
    email_service: "EmailService | None" = None,
    pushover_service: "PushoverService | None" = None,
    loop: asyncio.AbstractEventLoop | None = None,
    enable_scheduler: bool = True,
) -> tuple[logging.Handler | None, object | None]:
    """Configure logging based on application config.

    Args:
        config: Application configuration dict with optional 'logger' section
        email_service: Optional email service for email alerts
        pushover_service: Optional pushover service for push notifications
        loop: Optional asyncio event loop for email alerts
        enable_scheduler: Whether to enable the daily report scheduler
            (should be False for non-primary workers)

    Returns:
        Tuple of (alert_handler, daily_report_scheduler) if monitoring
        is enabled, otherwise (None, None)

    Logger config format:
        logger:
@@ -67,6 +92,68 @@ def setup_logging(config: dict | None = None):

    root_logger.info("Logging configured at %s level", level)

    # Setup unified notification monitoring if configured
    alert_handler = None
    report_scheduler = None

    # Check if unified notifications are configured
    notifications_config = config.get("notifications", {})
    if notifications_config and (email_service or pushover_service):
        try:
            # Import here to avoid circular dependencies
            from alpine_bits_python.notification_manager import (
                get_notification_config,
                setup_notification_service,
            )
            from alpine_bits_python.unified_monitoring import (
                UnifiedAlertHandler,
                UnifiedDailyReportScheduler,
            )

            # Setup unified notification service
            notification_service = setup_notification_service(
                config=config,
                email_service=email_service,
                pushover_service=pushover_service,
            )

            if notification_service:
                # Setup error alert handler
                error_alerts_config = get_notification_config("error_alerts", config)
                if error_alerts_config.get("enabled", False):
                    try:
                        alert_handler = UnifiedAlertHandler(
                            notification_service=notification_service,
                            config=error_alerts_config,
                            loop=loop,
                        )
                        alert_handler.setLevel(logging.ERROR)
                        root_logger.addHandler(alert_handler)
                        root_logger.info("Unified alert handler enabled for error monitoring")
                    except Exception:
                        root_logger.exception("Failed to setup unified alert handler")

                # Setup daily report scheduler (only if enabled and this is primary worker)
                daily_report_config = get_notification_config("daily_report", config)
                if daily_report_config.get("enabled", False) and enable_scheduler:
                    try:
                        report_scheduler = UnifiedDailyReportScheduler(
                            notification_service=notification_service,
                            config=daily_report_config,
                        )
                        root_logger.info("Unified daily report scheduler configured (primary worker)")
                    except Exception:
                        root_logger.exception("Failed to setup unified daily report scheduler")
                elif daily_report_config.get("enabled", False) and not enable_scheduler:
                    root_logger.info(
                        "Unified daily report scheduler disabled (non-primary worker)"
                    )

        except Exception:
            root_logger.exception("Failed to setup unified notification monitoring")

    return alert_handler, report_scheduler
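
# A minimal call sketch for the extended signature above (assumes config and the
# optional services already exist; is_primary_worker is a hypothetical flag):
#
#     alert_handler, report_scheduler = setup_logging(
#         config=config,
#         email_service=email_service,
#         pushover_service=pushover_service,
#         loop=asyncio.get_running_loop(),
#         enable_scheduler=is_primary_worker,
#     )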

def get_logger(name: str) -> logging.Logger:
    """Get a logger instance for the given module name.
365
src/alpine_bits_python/migrations.py
Normal file
@@ -0,0 +1,365 @@
|
||||
"""Database migrations for AlpineBits.
|
||||
|
||||
This module contains migration functions that are automatically run at app startup
|
||||
to update existing database schemas without losing data.
|
||||
"""
|
||||
|
||||
from typing import Any
|
||||
|
||||
from sqlalchemy import inspect, text
|
||||
from sqlalchemy.ext.asyncio import AsyncEngine
|
||||
|
||||
from .const import CONF_GOOGLE_ACCOUNT, CONF_HOTEL_ID, CONF_META_ACCOUNT
|
||||
from .logging_config import get_logger
|
||||
from .db import Reservation
|
||||
|
||||
_LOGGER = get_logger(__name__)
|
||||
|
||||
|
||||
async def check_column_exists(engine: AsyncEngine, table_name: str, column_name: str) -> bool:
|
||||
"""Check if a column exists in a table.
|
||||
|
||||
Args:
|
||||
engine: SQLAlchemy async engine
|
||||
table_name: Name of the table to check
|
||||
column_name: Name of the column to check
|
||||
|
||||
Returns:
|
||||
True if column exists, False otherwise
|
||||
"""
|
||||
async with engine.connect() as conn:
|
||||
def _check(connection):
|
||||
inspector = inspect(connection)
|
||||
columns = [col['name'] for col in inspector.get_columns(table_name)]
|
||||
return column_name in columns
|
||||
|
||||
result = await conn.run_sync(_check)
|
||||
return result
|
||||
|
||||
|
||||
async def add_column_if_not_exists(
    engine: AsyncEngine,
    table_name: str,
    column_name: str,
    column_type: str = "VARCHAR",
) -> bool:
    """Add a column to a table if it doesn't already exist.

    Args:
        engine: SQLAlchemy async engine
        table_name: Name of the table
        column_name: Name of the column to add
        column_type: SQL type of the column (default: VARCHAR)

    Returns:
        True if column was added, False if it already existed
    """
    exists = await check_column_exists(engine, table_name, column_name)

    if exists:
        _LOGGER.debug("Column %s.%s already exists, skipping", table_name, column_name)
        return False

    _LOGGER.info("Adding column %s.%s (%s)", table_name, column_name, column_type)

    async with engine.begin() as conn:
        # Identifiers come from trusted internal callers; DDL statements like
        # ALTER TABLE cannot take bound parameters, so interpolation is used here
        sql = f"ALTER TABLE {table_name} ADD COLUMN {column_name} {column_type}"
        await conn.execute(text(sql))

    _LOGGER.info("Successfully added column %s.%s", table_name, column_name)
    return True

async def migrate_add_room_types(engine: AsyncEngine) -> None:
    """Migration: Add RoomTypes fields to reservations table.

    This migration adds three optional fields:
    - room_type_code: String (max 8 chars)
    - room_classification_code: String (numeric pattern)
    - room_type: String (enum: 1-5)

    Safe to run multiple times - will skip if columns already exist.
    """
    _LOGGER.info("Running migration: add_room_types")

    added_count = 0

    # Add each column if it doesn't exist
    if await add_column_if_not_exists(engine, "reservations", "room_type_code", "VARCHAR"):
        added_count += 1

    if await add_column_if_not_exists(engine, "reservations", "room_classification_code", "VARCHAR"):
        added_count += 1

    if await add_column_if_not_exists(engine, "reservations", "room_type", "VARCHAR"):
        added_count += 1

    if added_count > 0:
        _LOGGER.info("Migration add_room_types: Added %d columns", added_count)
    else:
        _LOGGER.info("Migration add_room_types: No changes needed (already applied)")

async def migrate_add_advertising_account_ids(
    engine: AsyncEngine, config: dict[str, Any] | None = None
) -> None:
    """Migration: Add advertising account ID fields to reservations table.

    This migration adds two optional fields:
    - meta_account_id: String (Meta/Facebook advertising account ID)
    - google_account_id: String (Google advertising account ID)

    These fields are populated conditionally based on fbclid/gclid presence.
    For existing reservations, backfills account IDs from config based on
    hotel_code and fbclid/gclid.
    Safe to run multiple times - will skip if columns already exist.

    Args:
        engine: SQLAlchemy async engine
        config: Application configuration dict containing hotel account IDs
    """
    _LOGGER.info("Running migration: add_advertising_account_ids")

    added_count = 0

    # Add each column if it doesn't exist
    if await add_column_if_not_exists(engine, "reservations", "meta_account_id", "VARCHAR"):
        added_count += 1

    if await add_column_if_not_exists(engine, "reservations", "google_account_id", "VARCHAR"):
        added_count += 1

    if added_count > 0:
        _LOGGER.info("Migration add_advertising_account_ids: Added %d columns", added_count)
    else:
        _LOGGER.info("Migration add_advertising_account_ids: Columns already exist")

    # Backfill existing reservations with account IDs based on config and fbclid/gclid presence
    if config:
        await _backfill_advertising_account_ids(engine, config)
    else:
        _LOGGER.warning("No config provided, skipping backfill of advertising account IDs")

async def _backfill_advertising_account_ids(engine: AsyncEngine, config: dict[str, Any]) -> None:
    """Backfill advertising account IDs for existing reservations.

    Updates existing reservations to populate meta_account_id and google_account_id
    based on the conditional logic:
    - If fbclid is present, set meta_account_id from hotel config
    - If gclid is present, set google_account_id from hotel config

    Args:
        engine: SQLAlchemy async engine
        config: Application configuration dict
    """
    _LOGGER.info("Backfilling advertising account IDs for existing reservations...")

    # Build a mapping of hotel_id -> account IDs from config
    hotel_accounts = {}
    alpine_bits_auth = config.get("alpine_bits_auth", [])

    for hotel in alpine_bits_auth:
        hotel_id = hotel.get(CONF_HOTEL_ID)
        meta_account = hotel.get(CONF_META_ACCOUNT)
        google_account = hotel.get(CONF_GOOGLE_ACCOUNT)

        if hotel_id:
            hotel_accounts[hotel_id] = {
                "meta_account": meta_account,
                "google_account": google_account,
            }

    if not hotel_accounts:
        _LOGGER.info("No hotel accounts found in config, skipping backfill")
        return

    _LOGGER.info("Found %d hotel(s) with account configurations", len(hotel_accounts))

    # Update reservations with meta_account_id where fbclid is present
    meta_updated = 0
    for hotel_id, accounts in hotel_accounts.items():
        if accounts["meta_account"]:
            async with engine.begin() as conn:
                sql = text(
                    "UPDATE reservations "
                    "SET meta_account_id = :meta_account "
                    "WHERE hotel_code = :hotel_id "
                    "AND fbclid IS NOT NULL "
                    "AND fbclid != '' "
                    "AND (meta_account_id IS NULL OR meta_account_id = '')"
                )
                result = await conn.execute(
                    sql,
                    {"meta_account": accounts["meta_account"], "hotel_id": hotel_id},
                )
                count = result.rowcount
                if count > 0:
                    _LOGGER.info(
                        "Updated %d reservations with meta_account_id for hotel %s",
                        count,
                        hotel_id,
                    )
                    meta_updated += count

    # Update reservations with google_account_id where gclid is present
    google_updated = 0
    for hotel_id, accounts in hotel_accounts.items():
        if accounts["google_account"]:
            async with engine.begin() as conn:
                sql = text(
                    "UPDATE reservations "
                    "SET google_account_id = :google_account "
                    "WHERE hotel_code = :hotel_id "
                    "AND gclid IS NOT NULL "
                    "AND gclid != '' "
                    "AND (google_account_id IS NULL OR google_account_id = '')"
                )
                result = await conn.execute(
                    sql,
                    {"google_account": accounts["google_account"], "hotel_id": hotel_id},
                )
                count = result.rowcount
                if count > 0:
                    _LOGGER.info(
                        "Updated %d reservations with google_account_id for hotel %s",
                        count,
                        hotel_id,
                    )
                    google_updated += count

    _LOGGER.info(
        "Backfill complete: %d reservations updated with meta_account_id, %d with google_account_id",
        meta_updated,
        google_updated,
    )

async def migrate_add_username_to_acked_requests(
    engine: AsyncEngine, config: dict[str, Any] | None = None
) -> None:
    """Migration: Add username column to acked_requests and backfill it from hotel usernames.

    This migration adds a username column to acked_requests to track
    acknowledgements by username instead of just client_id. This improves
    consistency, since client_ids can change but usernames are stable.

    For existing acknowledgements, this migration queries reservations to
    determine the hotel_code, then looks up the corresponding username from
    the config and populates the new column.

    Safe to run multiple times - will skip if the column already exists.

    Args:
        engine: SQLAlchemy async engine
        config: Application configuration dict containing hotel usernames
    """
    _LOGGER.info("Running migration: add_username_to_acked_requests")

    # Add the username column if it doesn't exist
    if await add_column_if_not_exists(engine, "acked_requests", "username", "VARCHAR"):
        _LOGGER.info("Added username column to acked_requests table")
    else:
        _LOGGER.info("Username column already exists in acked_requests, skipping")
        return

    # Backfill existing acknowledgements with username from config
    if config:
        await _backfill_acked_requests_username(engine, config)
    else:
        _LOGGER.warning("No config provided, skipping backfill of acked_requests usernames")

async def _backfill_acked_requests_username(engine: AsyncEngine, config: dict[str, Any]) -> None:
    """Backfill username for existing acked_requests records.

    For each acknowledgement, find the corresponding reservation to determine
    its hotel_code, then look up the username for that hotel in the config and
    update the acked_request record.

    Args:
        engine: SQLAlchemy async engine
        config: Application configuration dict
    """
    _LOGGER.info("Backfilling usernames for existing acked_requests...")

    # Build a mapping of hotel_id -> username from config
    hotel_usernames = {}
    alpine_bits_auth = config.get("alpine_bits_auth", [])

    for hotel in alpine_bits_auth:
        hotel_id = hotel.get(CONF_HOTEL_ID)
        username = hotel.get("username")

        if hotel_id and username:
            hotel_usernames[hotel_id] = username

    if not hotel_usernames:
        _LOGGER.info("No hotel usernames found in config, skipping backfill")
        return

    _LOGGER.info("Found %d hotel(s) with usernames in config", len(hotel_usernames))

    # Update acked_requests with usernames by matching to reservations
    total_updated = 0
    async with engine.begin() as conn:
        for hotel_id, username in hotel_usernames.items():
            sql = text("""
                UPDATE acked_requests
                SET username = :username
                WHERE unique_id IN (
                    SELECT md5_unique_id FROM reservations WHERE hotel_code = :hotel_id
                )
                AND username IS NULL
            """)
            result = await conn.execute(
                sql,
                {"username": username, "hotel_id": hotel_id},
            )
            count = result.rowcount
            if count > 0:
                _LOGGER.info(
                    "Updated %d acknowledgements with username for hotel %s",
                    count,
                    hotel_id,
                )
                total_updated += count

    _LOGGER.info("Backfill complete: %d acknowledgements updated with username", total_updated)

async def migrate_normalize_conversions(engine: AsyncEngine) -> None:
    """Migration: Normalize conversions and room reservations structure.

    This migration redesigns the conversion data structure:
    - conversions: One row per PMS reservation (with guest/advertising metadata)
    - room_reservations: One row per room reservation (linked to conversion)
    - daily_sales: JSON array of daily sales within each room reservation
    - total_revenue: Extracted sum of all daily sales for efficiency

    Old structure: One row per daily sale (denormalized, lots of duplication)
    New structure: One row per room reservation, daily sales as JSON with extracted total

    This allows:
    - Upserts on room reservations (same room doesn't get duplicated)
    - Better tracking of room data separate from daily sales data
    - Efficient querying via extracted total_revenue field
    - All daily sales details preserved in JSON for analysis

    The tables are created via Base.metadata.create_all() at startup.

    Safe to run multiple times - idempotent.
    """
    _LOGGER.info("Running migration: normalize_conversions")
    _LOGGER.info(
        "Conversion data structure redesigned: "
        "conversions (1 per PMS reservation) + "
        "room_reservations (1 per room, daily_sales as JSON). "
        "Tables created/updated via Base.metadata.create_all()"
    )

async def run_all_migrations(engine: AsyncEngine, config: dict[str, Any] | None = None) -> None:
    """Run all pending migrations.

    This function should be called at app startup, after Base.metadata.create_all.
    Each migration function should be idempotent (safe to run multiple times).

    Args:
        engine: SQLAlchemy async engine
        config: Application configuration dict (optional, but required for some migrations)
    """
    _LOGGER.info("Starting database migrations...")

    try:
        # Add new migrations here in chronological order
        await migrate_add_room_types(engine)
        await migrate_add_advertising_account_ids(engine, config)
        await migrate_add_username_to_acked_requests(engine, config)
        await migrate_normalize_conversions(engine)

        _LOGGER.info("Database migrations completed successfully")

    except Exception as e:
        _LOGGER.exception("Migration failed: %s", e)
        raise
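
# A minimal startup sketch (assumes an aiosqlite-backed engine; the URL and the
# config dict are placeholders -- in the app, config comes from config.yaml):
import asyncio

from sqlalchemy.ext.asyncio import create_async_engine

async def _migrate_example() -> None:
    engine = create_async_engine("sqlite+aiosqlite:///alpinebits.db")
    config = {"alpine_bits_auth": []}
    # Base.metadata.create_all() is expected to have run first (see docstring)
    await run_all_migrations(engine, config)

asyncio.run(_migrate_example())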
127
src/alpine_bits_python/notification_adapters.py
Normal file
@@ -0,0 +1,127 @@
|
||||
"""Adapters for notification backends.
|
||||
|
||||
This module provides adapters that wrap email and Pushover services
|
||||
to work with the unified notification service interface.
|
||||
"""
|
||||
|
||||
from typing import Any
|
||||
|
||||
from .email_service import EmailService
|
||||
from .logging_config import get_logger
|
||||
from .pushover_service import PushoverService
|
||||
|
||||
_LOGGER = get_logger(__name__)
|
||||
|
||||
|
||||
class EmailNotificationAdapter:
|
||||
"""Adapter for EmailService to work with NotificationService."""
|
||||
|
||||
def __init__(self, email_service: EmailService, recipients: list[str]):
|
||||
"""Initialize the email notification adapter.
|
||||
|
||||
Args:
|
||||
email_service: EmailService instance
|
||||
recipients: List of recipient email addresses
|
||||
|
||||
"""
|
||||
self.email_service = email_service
|
||||
self.recipients = recipients
|
||||
|
||||
async def send_alert(self, title: str, message: str, **kwargs) -> bool:
|
||||
"""Send an alert via email.
|
||||
|
||||
Args:
|
||||
title: Email subject
|
||||
message: Email body
|
||||
**kwargs: Ignored for email
|
||||
|
||||
Returns:
|
||||
True if sent successfully
|
||||
|
||||
"""
|
||||
return await self.email_service.send_alert(
|
||||
recipients=self.recipients,
|
||||
subject=title,
|
||||
body=message,
|
||||
)
|
||||
|
||||
async def send_daily_report(
|
||||
self,
|
||||
stats: dict[str, Any],
|
||||
errors: list[dict[str, Any]] | None = None,
|
||||
**kwargs,
|
||||
) -> bool:
|
||||
"""Send a daily report via email.
|
||||
|
||||
Args:
|
||||
stats: Statistics dictionary
|
||||
errors: Optional list of errors
|
||||
**kwargs: Ignored for email
|
||||
|
||||
Returns:
|
||||
True if sent successfully
|
||||
|
||||
"""
|
||||
return await self.email_service.send_daily_report(
|
||||
recipients=self.recipients,
|
||||
stats=stats,
|
||||
errors=errors,
|
||||
)
|
||||
|
||||
|
||||
class PushoverNotificationAdapter:
    """Adapter for PushoverService to work with NotificationService."""

    def __init__(self, pushover_service: PushoverService, priority: int = 0):
        """Initialize the Pushover notification adapter.

        Args:
            pushover_service: PushoverService instance
            priority: Default priority level for notifications

        """
        self.pushover_service = pushover_service
        self.priority = priority

    async def send_alert(self, title: str, message: str, **kwargs) -> bool:
        """Send an alert via Pushover.

        Args:
            title: Notification title
            message: Notification message
            **kwargs: Can include 'priority' to override default

        Returns:
            True if sent successfully

        """
        priority = kwargs.get("priority", self.priority)
        return await self.pushover_service.send_alert(
            title=title,
            message=message,
            priority=priority,
        )

    async def send_daily_report(
        self,
        stats: dict[str, Any],
        errors: list[dict[str, Any]] | None = None,
        **kwargs,
    ) -> bool:
        """Send a daily report via Pushover.

        Args:
            stats: Statistics dictionary
            errors: Optional list of errors
            **kwargs: Can include 'priority' to override default

        Returns:
            True if sent successfully

        """
        priority = kwargs.get("priority", self.priority)
        return await self.pushover_service.send_daily_report(
            stats=stats,
            errors=errors,
            priority=priority,
        )
156
src/alpine_bits_python/notification_manager.py
Normal file
@@ -0,0 +1,156 @@
|
||||
"""Unified notification manager for setting up recipient-based notification routing.
|
||||
|
||||
This module provides helpers to initialize the unified notification system
|
||||
based on the recipients configuration.
|
||||
"""
|
||||
|
||||
from typing import Any
|
||||
|
||||
from .email_service import EmailService
|
||||
from .logging_config import get_logger
|
||||
from .notification_adapters import EmailNotificationAdapter, PushoverNotificationAdapter
|
||||
from .notification_service import NotificationService
|
||||
from .pushover_service import PushoverService
|
||||
|
||||
_LOGGER = get_logger(__name__)
|
||||
|
||||
|
||||
def setup_notification_service(
|
||||
config: dict[str, Any],
|
||||
email_service: EmailService | None = None,
|
||||
pushover_service: PushoverService | None = None,
|
||||
) -> NotificationService | None:
|
||||
"""Set up unified notification service from config.
|
||||
|
||||
Args:
|
||||
config: Full configuration dictionary
|
||||
email_service: Optional EmailService instance
|
||||
pushover_service: Optional PushoverService instance
|
||||
|
||||
Returns:
|
||||
NotificationService instance, or None if no recipients configured
|
||||
|
||||
"""
|
||||
    notifications_config = config.get("notifications", {})
    recipients = notifications_config.get("recipients", [])

    if not recipients:
        _LOGGER.info("No notification recipients configured")
        return None

    notification_service = NotificationService()

    # Process each recipient and their methods
    for recipient in recipients:
        recipient_name = recipient.get("name", "unknown")
        methods = recipient.get("methods", [])

        for method in methods:
            method_type = method.get("type")

            if method_type == "email":
                if not email_service:
                    _LOGGER.warning(
                        "Email method configured for %s but email service not available",
                        recipient_name,
                    )
                    continue

                email_address = method.get("address")
                if not email_address:
                    _LOGGER.warning("Email method for %s missing address", recipient_name)
                    continue

                # Register a single shared email backend on first use
                if not notification_service.has_backend("email"):
                    # Create email adapter with all email recipients
                    email_recipients = []
                    for r in recipients:
                        for m in r.get("methods", []):
                            if m.get("type") == "email" and m.get("address"):
                                email_recipients.append(m.get("address"))

                    if email_recipients:
                        email_adapter = EmailNotificationAdapter(
                            email_service, email_recipients
                        )
                        notification_service.register_backend("email", email_adapter)
                        _LOGGER.info(
                            "Registered email backend with %d recipient(s)",
                            len(email_recipients),
                        )

            elif method_type == "pushover":
                if not pushover_service:
                    _LOGGER.warning(
                        "Pushover method configured for %s but pushover service not available",
                        recipient_name,
                    )
                    continue

                priority = method.get("priority", 0)

                # Check if we already have a pushover backend
                if not notification_service.has_backend("pushover"):
                    # Pushover sends to the user_key configured in the pushover service
                    pushover_adapter = PushoverNotificationAdapter(
                        pushover_service, priority
                    )
                    notification_service.register_backend("pushover", pushover_adapter)
                    _LOGGER.info("Registered pushover backend with priority %d", priority)

    if not notification_service.backends:
        _LOGGER.warning("No notification backends could be configured")
        return None

    _LOGGER.info(
        "Notification service configured with backends: %s",
        list(notification_service.backends.keys()),
    )
    return notification_service


def get_enabled_backends(
    notification_type: str, config: dict[str, Any]
) -> list[str] | None:
    """Get list of enabled backends for a notification type.

    Args:
        notification_type: "daily_report" or "error_alerts"
        config: Full configuration dictionary

    Returns:
        List of backend names to use, or None for all backends

    """
    notifications_config = config.get("notifications", {})
    notification_config = notifications_config.get(notification_type, {})

    if not notification_config.get("enabled", False):
        return []

    # Return None to indicate all backends should be used
    # The NotificationService will send to all registered backends
    return None


def get_notification_config(
    notification_type: str, config: dict[str, Any]
) -> dict[str, Any]:
    """Get configuration for a specific notification type.

    Args:
        notification_type: "daily_report" or "error_alerts"
        config: Full configuration dictionary

    Returns:
        Configuration dictionary for the notification type

    """
    notifications_config = config.get("notifications", {})
    return notifications_config.get(notification_type, {})
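
# A hypothetical config.yaml excerpt matching the keys this module reads
# (notifications.recipients[*].methods[*] plus per-type "enabled" flags);
# the name and address are placeholders:
#
#     notifications:
#       recipients:
#         - name: ops
#           methods:
#             - type: email
#               address: ops@example.com
#             - type: pushover
#               priority: 1
#       error_alerts:
#         enabled: true
#       daily_report:
#         enabled: true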
177
src/alpine_bits_python/notification_service.py
Normal file
@@ -0,0 +1,177 @@
|
||||
"""Unified notification service supporting multiple backends.
|
||||
|
||||
This module provides a unified interface for sending notifications through
|
||||
different channels (email, Pushover, etc.) for alerts and daily reports.
|
||||
"""
|
||||
|
||||
from typing import Any, Protocol
|
||||
|
||||
from .logging_config import get_logger
|
||||
|
||||
_LOGGER = get_logger(__name__)
|
||||
|
||||
|
||||
class NotificationBackend(Protocol):
|
||||
"""Protocol for notification backends."""
|
||||
|
||||
async def send_alert(self, title: str, message: str, **kwargs) -> bool:
|
||||
"""Send an alert notification.
|
||||
|
||||
Args:
|
||||
title: Alert title/subject
|
||||
message: Alert message/body
|
||||
**kwargs: Backend-specific parameters
|
||||
|
||||
Returns:
|
||||
True if sent successfully, False otherwise
|
||||
|
||||
"""
|
||||
...
|
||||
|
||||
async def send_daily_report(
|
||||
self,
|
||||
stats: dict[str, Any],
|
||||
errors: list[dict[str, Any]] | None = None,
|
||||
**kwargs,
|
||||
) -> bool:
|
||||
"""Send a daily report notification.
|
||||
|
||||
Args:
|
||||
stats: Statistics dictionary
|
||||
errors: Optional list of errors
|
||||
**kwargs: Backend-specific parameters
|
||||
|
||||
Returns:
|
||||
True if sent successfully, False otherwise
|
||||
|
||||
"""
|
||||
...
|
||||
|
||||
|
||||
class NotificationService:
    """Unified notification service that supports multiple backends.

    This service can send notifications through multiple channels simultaneously
    (email, Pushover, etc.) based on configuration.
    """

    def __init__(self):
        """Initialize the notification service."""
        self.backends: dict[str, NotificationBackend] = {}

    def register_backend(self, name: str, backend: NotificationBackend) -> None:
        """Register a notification backend.

        Args:
            name: Backend name (e.g., "email", "pushover")
            backend: Backend instance implementing NotificationBackend protocol

        """
        self.backends[name] = backend
        _LOGGER.info("Registered notification backend: %s", name)

    async def send_alert(
        self,
        title: str,
        message: str,
        backends: list[str] | None = None,
        **kwargs,
    ) -> dict[str, bool]:
        """Send an alert through specified backends.

        Args:
            title: Alert title/subject
            message: Alert message/body
            backends: List of backend names to use (None = all registered)
            **kwargs: Backend-specific parameters

        Returns:
            Dictionary mapping backend names to success status

        """
        if backends is None:
            backends = list(self.backends.keys())

        results = {}
        for backend_name in backends:
            backend = self.backends.get(backend_name)
            if backend is None:
                _LOGGER.warning("Backend not found: %s", backend_name)
                results[backend_name] = False
                continue

            try:
                success = await backend.send_alert(title, message, **kwargs)
                results[backend_name] = success
            except Exception:
                _LOGGER.exception("Error sending alert through backend %s", backend_name)
                results[backend_name] = False

        return results

    async def send_daily_report(
        self,
        stats: dict[str, Any],
        errors: list[dict[str, Any]] | None = None,
        backends: list[str] | None = None,
        **kwargs,
    ) -> dict[str, bool]:
        """Send a daily report through specified backends.

        Args:
            stats: Statistics dictionary
            errors: Optional list of errors
            backends: List of backend names to use (None = all registered)
            **kwargs: Backend-specific parameters

        Returns:
            Dictionary mapping backend names to success status

        """
        if backends is None:
            backends = list(self.backends.keys())

        results = {}
        for backend_name in backends:
            backend = self.backends.get(backend_name)
            if backend is None:
                _LOGGER.warning("Backend not found: %s", backend_name)
                results[backend_name] = False
                continue

            try:
                success = await backend.send_daily_report(stats, errors, **kwargs)
                results[backend_name] = success
            except Exception:
                _LOGGER.exception(
                    "Error sending daily report through backend %s", backend_name
                )
                results[backend_name] = False

        return results

    def get_backend(self, name: str) -> NotificationBackend | None:
        """Get a specific notification backend.

        Args:
            name: Backend name

        Returns:
            Backend instance or None if not found

        """
        return self.backends.get(name)

    def has_backend(self, name: str) -> bool:
        """Check if a backend is registered.

        Args:
            name: Backend name

        Returns:
            True if backend is registered

        """
        return name in self.backends
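
# A minimal sketch tying the service and adapters together (assumes
# email_service and pushover_service exist; the address is a placeholder):
from .notification_adapters import EmailNotificationAdapter, PushoverNotificationAdapter

async def _alert_example(email_service, pushover_service) -> None:
    service = NotificationService()
    service.register_backend(
        "email", EmailNotificationAdapter(email_service, ["ops@example.com"])
    )
    service.register_backend(
        "pushover", PushoverNotificationAdapter(pushover_service, priority=1)
    )
    results = await service.send_alert("Disk almost full", "Volume /data at 95%")
    # results maps backend name to success, e.g. {"email": True, "pushover": True}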
281
src/alpine_bits_python/pushover_service.py
Normal file
@@ -0,0 +1,281 @@
|
||||
"""Pushover service for sending push notifications.
|
||||
|
||||
This module provides push notification functionality for the AlpineBits application,
|
||||
including error alerts and daily reports via Pushover.
|
||||
"""
|
||||
|
||||
import asyncio
|
||||
from datetime import datetime
|
||||
from typing import Any
|
||||
|
||||
from pushover_complete import PushoverAPI
|
||||
|
||||
from .logging_config import get_logger
|
||||
|
||||
_LOGGER = get_logger(__name__)
|
||||
|
||||
|
||||
class PushoverConfig:
|
||||
"""Configuration for Pushover service.
|
||||
|
||||
Attributes:
|
||||
user_key: Pushover user/group key
|
||||
api_token: Pushover application API token
|
||||
|
||||
"""
|
||||
|
||||
def __init__(self, config: dict[str, Any]):
|
||||
"""Initialize Pushover configuration from config dict.
|
||||
|
||||
Args:
|
||||
config: Pushover configuration dictionary
|
||||
|
||||
"""
|
||||
self.user_key: str | None = config.get("user_key")
|
||||
self.api_token: str | None = config.get("api_token")
|
||||
|
||||
# Validate configuration
|
||||
if not self.user_key or not self.api_token:
|
||||
msg = "Both user_key and api_token are required for Pushover"
|
||||
raise ValueError(msg)
|
||||
|
||||
|
||||
class PushoverService:
    """Service for sending push notifications via Pushover.

    This service handles sending notifications through the Pushover API,
    including alerts and daily reports.
    """

    def __init__(self, config: PushoverConfig):
        """Initialize Pushover service.

        Args:
            config: Pushover configuration

        """
        self.config = config
        self.api = PushoverAPI(config.api_token)

    async def send_notification(
        self,
        title: str,
        message: str,
        priority: int = 0,
        url: str | None = None,
        url_title: str | None = None,
    ) -> bool:
        """Send a push notification via Pushover.

        Args:
            title: Notification title
            message: Notification message
            priority: Priority level (-2 to 2, default 0)
            url: Optional supplementary URL
            url_title: Optional title for the URL

        Returns:
            True if notification was sent successfully, False otherwise

        """
        try:
            # Send notification in thread pool (API is blocking)
            loop = asyncio.get_running_loop()
            await loop.run_in_executor(
                None,
                self._send_pushover,
                title,
                message,
                priority,
                url,
                url_title,
            )

            _LOGGER.info("Pushover notification sent successfully: %s", title)
            return True

        except Exception:
            _LOGGER.exception("Failed to send Pushover notification: %s", title)
            return False

    def _send_pushover(
        self,
        title: str,
        message: str,
        priority: int,
        url: str | None,
        url_title: str | None,
    ) -> None:
        """Send notification via Pushover (blocking operation).

        Args:
            title: Notification title
            message: Notification message
            priority: Priority level
            url: Optional URL
            url_title: Optional URL title

        Raises:
            Exception: If notification sending fails

        """
        kwargs = {
            "user": self.config.user_key,
            "title": title,
            "message": message,
            "priority": priority,
        }

        if url:
            kwargs["url"] = url
        if url_title:
            kwargs["url_title"] = url_title

        self.api.send_message(**kwargs)

    async def send_alert(
        self,
        title: str,
        message: str,
        priority: int = 1,
    ) -> bool:
        """Send an alert notification (convenience method).

        Args:
            title: Alert title
            message: Alert message
            priority: Priority level (default 1 for high priority)

        Returns:
            True if notification was sent successfully, False otherwise

        """
        return await self.send_notification(title, message, priority=priority)

    async def send_daily_report(
        self,
        stats: dict[str, Any],
        errors: list[dict[str, Any]] | None = None,
        priority: int = 0,
    ) -> bool:
        """Send a daily report notification.

        Args:
            stats: Dictionary containing statistics to include in report
            errors: Optional list of errors to include
            priority: Priority level (default 0 for normal)

        Returns:
            True if notification was sent successfully, False otherwise

        """
        date_str = datetime.now().strftime("%Y-%m-%d")
        title = f"AlpineBits Daily Report - {date_str}"

        # Build message body (Pushover has a 1024 character limit)
        message = self._build_daily_report_message(date_str, stats, errors)

        return await self.send_notification(title, message, priority=priority)

    def _build_daily_report_message(
        self,
        date_str: str,
        stats: dict[str, Any],
        errors: list[dict[str, Any]] | None,
    ) -> str:
        """Build daily report message for Pushover.

        Args:
            date_str: Date string for the report
            stats: Statistics dictionary
            errors: Optional list of errors

        Returns:
            Formatted message string (max 1024 chars for Pushover)

        """
        lines = [f"Report for {date_str}", ""]

        # Add statistics (simplified for push notification)
        if stats:
            # Handle reporting period
            period = stats.get("reporting_period", {})
            if period:
                start = period.get("start", "")
                end = period.get("end", "")
                if start and end:
                    # Parse the datetime strings to check if they're on different days
                    if " " in start and " " in end:
                        start_date, start_time = start.split(" ")
                        end_date, end_time = end.split(" ")

                        # If same day, just show times
                        if start_date == end_date:
                            lines.append(f"Period: {start_time} - {end_time}")
                        else:
                            # Different days, show date + time in compact format
                            # Format: "MM-DD HH:MM - MM-DD HH:MM"
                            start_compact = f"{start_date[5:]} {start_time[:5]}"
                            end_compact = f"{end_date[5:]} {end_time[:5]}"
                            lines.append(f"Period: {start_compact} - {end_compact}")
                    else:
                        # Fallback if format is unexpected
                        lines.append(f"Period: {start} - {end}")

            # Total reservations
            total = stats.get("total_reservations", 0)
            lines.append(f"Total Reservations: {total}")

            # Per-hotel breakdown (top 5 only to save space)
            hotels = stats.get("hotels", [])
            if hotels:
                lines.append("")
                lines.append("By Hotel:")
                for hotel in hotels[:5]:  # Top 5 hotels
                    hotel_name = hotel.get("hotel_name", "Unknown")
                    count = hotel.get("reservations", 0)
                    # Truncate long hotel names
                    if len(hotel_name) > 20:
                        hotel_name = hotel_name[:17] + "..."
                    lines.append(f"  • {hotel_name}: {count}")

                if len(hotels) > 5:
                    lines.append(f"  • ... and {len(hotels) - 5} more")

        # Add error summary if present
        if errors:
            lines.append("")
            lines.append(f"Errors: {len(errors)} (see logs)")

        message = "\n".join(lines)

        # Truncate if too long (Pushover limit is 1024 chars)
        if len(message) > 1020:
            message = message[:1017] + "..."

        return message


def create_pushover_service(config: dict[str, Any]) -> PushoverService | None:
    """Create a Pushover service from configuration.

    Args:
        config: Full application configuration dictionary

    Returns:
        PushoverService instance if Pushover is configured, None otherwise

    """
    pushover_config = config.get("pushover")
    if not pushover_config:
        _LOGGER.info("Pushover not configured, push notification service disabled")
        return None

    try:
        pushover_cfg = PushoverConfig(pushover_config)
        service = PushoverService(pushover_cfg)
        _LOGGER.info("Pushover service initialized successfully")
        return service
    except Exception:
        _LOGGER.exception("Failed to initialize Pushover service")
        return None
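
# A hypothetical config.yaml excerpt; user_key and api_token are the two keys
# PushoverConfig requires, with values kept in secrets.yaml via !secret:
#
#     pushover:
#       user_key: !secret PUSHOVER_USER_KEY
#       api_token: !secret PUSHOVER_API_TOKEN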
289
src/alpine_bits_python/reservation_service.py
Normal file
@@ -0,0 +1,289 @@
|
||||
"""Reservation service layer for handling reservation database operations."""
|
||||
|
||||
import hashlib
|
||||
from datetime import UTC, datetime
|
||||
from typing import Optional
|
||||
|
||||
from sqlalchemy import and_, select
|
||||
from sqlalchemy.ext.asyncio import AsyncSession
|
||||
|
||||
from .db import AckedRequest, Customer, Reservation
|
||||
from .schemas import ReservationData
|
||||
|
||||
|
||||
class ReservationService:
|
||||
"""Service for managing reservations and related operations.
|
||||
|
||||
Handles all database operations for reservations including creation,
|
||||
retrieval, and acknowledgement tracking.
|
||||
"""
|
||||
|
||||
def __init__(self, session: AsyncSession):
|
||||
self.session = session
|
||||
|
||||
def _convert_reservation_data_to_db(
|
||||
self, reservation_model: ReservationData, customer_id: int
|
||||
) -> Reservation:
|
||||
"""Convert ReservationData to Reservation model.
|
||||
|
||||
Args:
|
||||
reservation_model: ReservationData instance
|
||||
customer_id: Customer ID to link to
|
||||
|
||||
Returns:
|
||||
Reservation instance ready for database insertion
|
||||
"""
|
||||
data = reservation_model.model_dump(exclude_none=True)
|
||||
|
||||
# Convert children_ages list to CSV string
|
||||
children_list = data.pop("children_ages", [])
|
||||
children_csv = (
|
||||
",".join(str(int(a)) for a in children_list) if children_list else ""
|
||||
)
|
||||
data["children_ages"] = children_csv
|
||||
|
||||
# Inject foreign key
|
||||
data["customer_id"] = customer_id
|
||||
|
||||
return Reservation(**data)
|
||||
|
||||
    async def create_reservation(
        self, reservation_data: ReservationData, customer_id: int
    ) -> Reservation:
        """Create a new reservation.

        Args:
            reservation_data: ReservationData containing reservation details
            customer_id: ID of the customer making the reservation

        Returns:
            Created Reservation instance
        """
        reservation = self._convert_reservation_data_to_db(
            reservation_data, customer_id
        )
        self.session.add(reservation)
        await self.session.commit()
        await self.session.refresh(reservation)
        return reservation

    async def get_reservation_by_unique_id(
        self, unique_id: str
    ) -> Optional[Reservation]:
        """Get a reservation by unique_id.

        Args:
            unique_id: The unique_id to search for

        Returns:
            Reservation instance if found, None otherwise
        """
        result = await self.session.execute(
            select(Reservation).where(Reservation.unique_id == unique_id)
        )
        return result.scalar_one_or_none()

    async def get_reservation_by_md5_unique_id(
        self, md5_unique_id: str
    ) -> Optional[Reservation]:
        """Get a reservation by md5_unique_id.

        Args:
            md5_unique_id: The MD5 hash of unique_id

        Returns:
            Reservation instance if found, None otherwise
        """
        result = await self.session.execute(
            select(Reservation).where(
                Reservation.md5_unique_id == md5_unique_id
            )
        )
        return result.scalar_one_or_none()

    async def check_duplicate_reservation(
        self, unique_id: str, md5_unique_id: str
    ) -> bool:
        """Check if a reservation already exists.

        Args:
            unique_id: The unique_id to check
            md5_unique_id: The MD5 hash to check

        Returns:
            True if reservation exists, False otherwise
        """
        existing = await self.get_reservation_by_unique_id(unique_id)
        if existing:
            return True

        existing_md5 = await self.get_reservation_by_md5_unique_id(md5_unique_id)
        return existing_md5 is not None

    async def get_reservations_for_customer(
        self, customer_id: int
    ) -> list[Reservation]:
        """Get all reservations for a customer.

        Args:
            customer_id: The customer ID

        Returns:
            List of Reservation instances
        """
        result = await self.session.execute(
            select(Reservation).where(Reservation.customer_id == customer_id)
        )
        return list(result.scalars().all())

    async def get_reservations_with_filters(
        self,
        start_date: Optional[datetime] = None,
        end_date: Optional[datetime] = None,
        hotel_code: Optional[str] = None,
    ) -> list[tuple[Reservation, Customer]]:
        """Get reservations with optional filters, joined with customers.

        Args:
            start_date: Filter by created_at >= this value
            end_date: Filter by created_at <= this value
            hotel_code: Filter by hotel code

        Returns:
            List of (Reservation, Customer) tuples
        """
        query = select(Reservation, Customer).join(
            Customer, Reservation.customer_id == Customer.id
        )

        filters = []
        if start_date:
            filters.append(Reservation.created_at >= start_date)
        if end_date:
            filters.append(Reservation.created_at <= end_date)
        if hotel_code:
            filters.append(Reservation.hotel_code == hotel_code)

        if filters:
            query = query.where(and_(*filters))

        result = await self.session.execute(query)
        return list(result.all())

    async def get_unacknowledged_reservations(
        self,
        start_date: Optional[datetime] = None,
        end_date: Optional[datetime] = None,
        hotel_code: Optional[str] = None,
        username: Optional[str] = None,
        client_id: Optional[str] = None,
    ) -> list[tuple[Reservation, Customer]]:
        """Get reservations that haven't been acknowledged by a client.

        Prioritizes checking by username if provided, falling back to
        client_id for backward compatibility.

        Args:
            start_date: Filter by start date >= this value
            end_date: Filter by end date <= this value
            hotel_code: Filter by hotel code
            username: The username of the client (preferred for lookup)
            client_id: The client ID (fallback for backward compatibility)

        Returns:
            List of (Reservation, Customer) tuples that are unacknowledged
        """
        # Get all acknowledged unique_ids for this client/username
        if username:
            acked_result = await self.session.execute(
                select(AckedRequest.unique_id).where(
                    AckedRequest.username == username
                )
            )
        else:
            acked_result = await self.session.execute(
                select(AckedRequest.unique_id).where(
                    AckedRequest.client_id == client_id
                )
            )
        acked_md5_ids = {row[0] for row in acked_result.all()}

        # Get all reservations with filters
        all_reservations = await self.get_reservations_with_filters(
            start_date, end_date, hotel_code
        )

        # Filter out acknowledged ones (comparing md5_unique_id)
        return [
            (res, cust)
            for res, cust in all_reservations
            if res.md5_unique_id not in acked_md5_ids
        ]

    async def record_acknowledgement(
        self, client_id: str, unique_id: str, username: Optional[str] = None
    ) -> AckedRequest:
        """Record that a client has acknowledged a reservation.

        Args:
            client_id: The client ID
            unique_id: The unique_id of the reservation (md5_unique_id)
            username: The username of the client making the request (optional)

        Returns:
            Created AckedRequest instance
        """
        acked = AckedRequest(
            client_id=client_id,
            username=username,
            unique_id=unique_id,
            timestamp=datetime.now(UTC),
        )
        self.session.add(acked)
        await self.session.commit()
        await self.session.refresh(acked)
        return acked

    async def is_acknowledged(
        self,
        unique_id: str,
        username: Optional[str] = None,
        client_id: Optional[str] = None,
    ) -> bool:
        """Check if a reservation has been acknowledged by a client.

        Prioritizes checking by username if provided, falling back to
        client_id for backward compatibility.

        Args:
            unique_id: The reservation unique_id
            username: The username of the client (preferred for lookup)
            client_id: The client ID (fallback for backward compatibility)

        Returns:
            True if acknowledged, False otherwise
        """
        if username:
            result = await self.session.execute(
                select(AckedRequest).where(
                    and_(
                        AckedRequest.username == username,
                        AckedRequest.unique_id == unique_id,
                    )
                )
            )
        else:
            result = await self.session.execute(
                select(AckedRequest).where(
                    and_(
                        AckedRequest.client_id == client_id,
                        AckedRequest.unique_id == unique_id,
                    )
                )
            )
        return result.scalar_one_or_none() is not None

    @staticmethod
    def generate_md5_unique_id(unique_id: str) -> str:
        """Generate MD5 hash of unique_id.

        Args:
            unique_id: The unique_id to hash

        Returns:
            MD5 hash as hex string
        """
        # MD5 is used here as a stable identifier, not for security
        return hashlib.md5(unique_id.encode("utf-8")).hexdigest()
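
# A minimal acknowledgement-tracking sketch (assumes an AsyncSession from the
# app's session factory; the IDs and username are placeholders):
async def _ack_example(session: AsyncSession) -> None:
    service = ReservationService(session)
    md5_id = ReservationService.generate_md5_unique_id("reservation-123")
    if not await service.is_acknowledged(md5_id, username="hotel_a"):
        await service.record_acknowledgement(
            client_id="client-1", unique_id=md5_id, username="hotel_a"
        )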
@@ -6,9 +6,9 @@ import os
import uvicorn

if __name__ == "__main__":
    db_path = "alpinebits.db"  # Adjust path if needed
    if os.path.exists(db_path):
        os.remove(db_path)
    # db_path = "alpinebits.db"  # Adjust path if needed
    # if os.path.exists(db_path):
    #     os.remove(db_path)

    uvicorn.run(
        "alpine_bits_python.api:app",
@@ -16,6 +16,57 @@ from enum import Enum
 from pydantic import BaseModel, EmailStr, Field, field_validator, model_validator


+# Country name to ISO 3166-1 alpha-2 code mapping
+COUNTRY_NAME_TO_CODE = {
+    # English names
+    "germany": "DE",
+    "italy": "IT",
+    "austria": "AT",
+    "switzerland": "CH",
+    "france": "FR",
+    "netherlands": "NL",
+    "belgium": "BE",
+    "spain": "ES",
+    "portugal": "PT",
+    "united kingdom": "GB",
+    "uk": "GB",
+    "czech republic": "CZ",
+    "poland": "PL",
+    "hungary": "HU",
+    "croatia": "HR",
+    "slovenia": "SI",
+    # German names
+    "deutschland": "DE",
+    "italien": "IT",
+    "österreich": "AT",
+    "schweiz": "CH",
+    "frankreich": "FR",
+    "niederlande": "NL",
+    "belgien": "BE",
+    "spanien": "ES",
+    "vereinigtes königreich": "GB",
+    "tschechien": "CZ",
+    "polen": "PL",
+    "ungarn": "HU",
+    "kroatien": "HR",
+    "slowenien": "SI",
+    # Italian names
+    "germania": "DE",
+    "italia": "IT",
+    "svizzera": "CH",
+    "francia": "FR",
+    "paesi bassi": "NL",
+    "belgio": "BE",
+    "spagna": "ES",
+    "portogallo": "PT",
+    "regno unito": "GB",
+    "repubblica ceca": "CZ",
+    "polonia": "PL",
+    "ungheria": "HU",
+    "croazia": "HR",
+}
+
+
 # phonetechtype enum 1,3,5 voice, fax, mobile
 class PhoneTechType(Enum):
     VOICE = "1"

@@ -53,11 +104,18 @@ class ReservationData(BaseModel):
     user_comment: str | None = Field(None, max_length=2000)
     fbclid: str | None = Field(None, max_length=300)
     gclid: str | None = Field(None, max_length=300)
+    # Advertising account IDs (populated conditionally based on fbclid/gclid)
+    meta_account_id: str | None = Field(None, max_length=200)
+    google_account_id: str | None = Field(None, max_length=200)
     utm_source: str | None = Field(None, max_length=150)
     utm_medium: str | None = Field(None, max_length=150)
     utm_campaign: str | None = Field(None, max_length=150)
     utm_term: str | None = Field(None, max_length=150)
     utm_content: str | None = Field(None, max_length=150)
+    # RoomTypes fields (optional)
+    room_type_code: str | None = Field(None, min_length=1, max_length=8)
+    room_classification_code: str | None = Field(None, pattern=r"[0-9]+")
+    room_type: str | None = Field(None, pattern=r"^[1-5]$")

     @model_validator(mode="after")
     def ensure_md5(self) -> "ReservationData":

@@ -99,9 +157,7 @@ class CustomerData(BaseModel):
     address_line: str | None = Field(None, max_length=255)
     city_name: str | None = Field(None, max_length=100)
     postal_code: str | None = Field(None, max_length=20)
-    country_code: str | None = Field(
-        None, min_length=2, max_length=2, pattern="^[A-Z]{2}$"
-    )
+    country_code: str | None = Field(None, min_length=2, max_length=2)
     address_catalog: bool | None = None
     gender: str | None = Field(None, pattern="^(Male|Female|Unknown)$")
     birth_date: str | None = Field(None, pattern=r"^\d{4}-\d{2}-\d{2}$")  # ISO format

@@ -115,11 +171,44 @@ class CustomerData(BaseModel):
             raise ValueError("Name cannot be empty or whitespace")
         return v.strip()

-    @field_validator("country_code")
+    @field_validator("country_code", mode="before")
     @classmethod
     def normalize_country_code(cls, v: str | None) -> str | None:
-        """Normalize country code to uppercase."""
-        return v.upper() if v else None
+        """Normalize country input to ISO 3166-1 alpha-2 code.
+
+        Handles:
+        - Country names in English, German, and Italian
+        - Already valid 2-letter codes (case-insensitive)
+        - None/empty values
+
+        Runs in 'before' mode to normalize before other validations.
+        This ensures that old data saved incorrectly in the database is
+        transformed into the correct format when retrieved, and that new
+        data is always normalized regardless of the source.
+
+        Args:
+            v: Country name or code (case-insensitive)
+
+        Returns:
+            2-letter ISO country code (uppercase) or None if input is None/empty
+        """
+        if not v:
+            return None
+
+        # Convert to string and strip whitespace
+        country_input = str(v).strip()
+
+        if not country_input:
+            return None
+
+        # If already 2 letters, assume it's a country code (ISO 3166-1 alpha-2)
+        iso_country_code_length = 2
+        if len(country_input) == iso_country_code_length and country_input.isalpha():
+            return country_input.upper()
+
+        # Try to match as country name (case-insensitive)
+        country_lower = country_input.lower()
+        return COUNTRY_NAME_TO_CODE.get(country_lower, country_input)

     @field_validator("language")
     @classmethod
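A minimal sketch of the normalization behavior. The required fields of CustomerData besides country_code are assumptions here, not taken from the diff:

# Hypothetical usage; exact required fields of CustomerData may differ.
c = CustomerData(given_name="Anna", surname="Muster", country_code="Deutschland")
print(c.country_code)  # "DE": German name mapped to ISO code
c = CustomerData(given_name="Anna", surname="Muster", country_code="it")
print(c.country_code)  # "IT": 2-letter code uppercased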
src/alpine_bits_python/unified_monitoring.py (new file, 390 lines)
@@ -0,0 +1,390 @@
"""Unified monitoring with support for multiple notification backends.

This module provides alert handlers and schedulers that work with the
unified notification service to send alerts through multiple channels.
"""

import asyncio
import logging
import threading
from collections import deque
from concurrent.futures import Future
from datetime import datetime, timedelta
from typing import Any

from .email_monitoring import ErrorRecord, ReservationStatsCollector
from .logging_config import get_logger
from .notification_service import NotificationService

_LOGGER = get_logger(__name__)


class UnifiedAlertHandler(logging.Handler):
    """Custom logging handler that sends alerts through the unified notification service.

    This handler uses a hybrid approach:
    - Accumulates errors in a buffer
    - Sends immediately if the error threshold is reached
    - Otherwise sends after the buffer duration expires
    - Always sends buffered errors (no minimum threshold for time-based flush)
    - Implements a cooldown to prevent alert spam

    The handler is thread-safe and works with asyncio event loops.
    """

    def __init__(
        self,
        notification_service: NotificationService,
        config: dict[str, Any],
        loop: asyncio.AbstractEventLoop | None = None,
    ):
        """Initialize the unified alert handler.

        Args:
            notification_service: Unified notification service
            config: Configuration dictionary for error alerts
            loop: Asyncio event loop (will use current loop if not provided)

        """
        super().__init__()
        self.notification_service = notification_service
        self.config = config
        self.loop = loop  # Will be set when first error occurs if not provided

        # Configuration
        self.error_threshold = config.get("error_threshold", 5)
        self.buffer_minutes = config.get("buffer_minutes", 15)
        self.cooldown_minutes = config.get("cooldown_minutes", 15)
        self.log_levels = config.get("log_levels", ["ERROR", "CRITICAL"])

        # State
        self.error_buffer: deque[ErrorRecord] = deque()
        self.last_sent = datetime.min  # Last time we sent an alert
        # run_coroutine_threadsafe returns a concurrent.futures.Future,
        # not an asyncio.Task, so the annotation reflects that.
        self._flush_task: Future | None = None
        self._lock = threading.Lock()  # Thread-safe for multi-threaded logging

        _LOGGER.info(
            "UnifiedAlertHandler initialized: threshold=%d, buffer=%dmin, cooldown=%dmin",
            self.error_threshold,
            self.buffer_minutes,
            self.cooldown_minutes,
        )

    def emit(self, record: logging.LogRecord) -> None:
        """Handle a log record.

        This is called automatically by the logging system when an error is logged.
        It's important that this method is fast and doesn't block.

        Args:
            record: The log record to handle

        """
        # Only handle configured log levels
        if record.levelname not in self.log_levels:
            return

        try:
            # Ensure we have an event loop
            if self.loop is None:
                try:
                    self.loop = asyncio.get_running_loop()
                except RuntimeError:
                    # No running loop, we'll need to handle this differently
                    _LOGGER.warning("No asyncio event loop available for alerts")
                    return

            # Add error to buffer (thread-safe)
            with self._lock:
                error_record = ErrorRecord(record)
                self.error_buffer.append(error_record)
                buffer_size = len(self.error_buffer)

            # Determine if we should send immediately
            should_send_immediately = buffer_size >= self.error_threshold

            if should_send_immediately:
                # Cancel any pending flush task
                if self._flush_task and not self._flush_task.done():
                    self._flush_task.cancel()

                # Schedule immediate flush
                self._flush_task = asyncio.run_coroutine_threadsafe(
                    self._flush_buffer(immediate=True),
                    self.loop,
                )
            # Schedule delayed flush if not already scheduled
            elif not self._flush_task or self._flush_task.done():
                self._flush_task = asyncio.run_coroutine_threadsafe(
                    self._schedule_delayed_flush(),
                    self.loop,
                )

        except Exception:
            # Never let the handler crash - just log and continue
            _LOGGER.exception("Error in UnifiedAlertHandler.emit")

    async def _schedule_delayed_flush(self) -> None:
        """Schedule a delayed buffer flush after the buffer duration."""
        await asyncio.sleep(self.buffer_minutes * 60)
        await self._flush_buffer(immediate=False)

    async def _flush_buffer(self, *, immediate: bool) -> None:
        """Flush the error buffer and send an alert.

        Args:
            immediate: Whether this is an immediate flush (threshold hit)

        """
        # Check cooldown period
        now = datetime.now()
        time_since_last = (now - self.last_sent).total_seconds() / 60

        if time_since_last < self.cooldown_minutes:
            _LOGGER.info(
                "Alert cooldown active (%.1f min remaining), buffering errors",
                self.cooldown_minutes - time_since_last,
            )
            # Don't clear buffer - let errors accumulate until cooldown expires
            return

        # Get all buffered errors (thread-safe)
        with self._lock:
            if not self.error_buffer:
                return

            errors = list(self.error_buffer)
            self.error_buffer.clear()

        # Update last sent time
        self.last_sent = now

        # Format alert
        error_count = len(errors)
        time_range = (
            f"{errors[0].timestamp.strftime('%H:%M:%S')} to "
            f"{errors[-1].timestamp.strftime('%H:%M:%S')}"
        )

        # Determine alert type
        alert_type = "Immediate Alert" if immediate else "Scheduled Alert"
        if immediate:
            reason = f"(threshold of {self.error_threshold} exceeded)"
        else:
            reason = f"({self.buffer_minutes} minute buffer)"

        title = f"AlpineBits Error {alert_type}: {error_count} errors {reason}"

        # Build message
        message = f"Error Alert - {datetime.now().strftime('%Y-%m-%d %H:%M:%S')}\n"
        message += "=" * 70 + "\n\n"
        message += f"Alert Type: {alert_type}\n"
        message += f"Error Count: {error_count}\n"
        message += f"Time Range: {time_range}\n"
        message += f"Reason: {reason}\n"
        message += "\n" + "=" * 70 + "\n\n"

        # Add individual errors
        message += "Errors:\n"
        message += "-" * 70 + "\n\n"
        for error in errors:
            message += error.format_plain_text()
            message += "\n"

        message += "-" * 70 + "\n"
        message += (
            f"Generated by AlpineBits Monitoring at "
            f"{now.strftime('%Y-%m-%d %H:%M:%S')}\n"
        )

        # Send through unified notification service
        try:
            results = await self.notification_service.send_alert(
                title=title,
                message=message,
                backends=None,  # Send to all backends
            )

            success_count = sum(1 for success in results.values() if success)
            if success_count > 0:
                _LOGGER.info(
                    "Alert sent successfully through %d/%d backend(s): %d errors",
                    success_count,
                    len(results),
                    error_count,
                )
            else:
                _LOGGER.error(
                    "Failed to send alert through any backend: %d errors", error_count
                )

        except Exception:
            _LOGGER.exception("Exception while sending alert")

    def close(self) -> None:
        """Close the handler and flush any remaining errors.

        This is called when the logging system shuts down.
        """
        # Cancel any pending flush tasks
        if self._flush_task and not self._flush_task.done():
            self._flush_task.cancel()

        # Flush any remaining errors immediately
        if self.error_buffer and self.loop:
            try:
                # Check if the loop is still running
                if not self.loop.is_closed():
                    future = asyncio.run_coroutine_threadsafe(
                        self._flush_buffer(immediate=False),
                        self.loop,
                    )
                    future.result(timeout=5)
                else:
                    _LOGGER.warning(
                        "Event loop closed, cannot flush %d remaining errors",
                        len(self.error_buffer),
                    )
            except Exception:
                _LOGGER.exception("Error flushing buffer on close")

        super().close()


class UnifiedDailyReportScheduler:
    """Scheduler for sending daily reports through the unified notification service.

    This runs as a background task and sends daily reports containing
    statistics and error summaries through all configured notification backends.
    """

    def __init__(
        self,
        notification_service: NotificationService,
        config: dict[str, Any],
    ):
        """Initialize the unified daily report scheduler.

        Args:
            notification_service: Unified notification service
            config: Configuration for daily reports

        """
        self.notification_service = notification_service
        self.config = config
        self.send_time = config.get("send_time", "08:00")  # Default 8 AM
        self.include_stats = config.get("include_stats", True)
        self.include_errors = config.get("include_errors", True)

        self._task: asyncio.Task | None = None
        self._stats_collector = None  # Will be set by the application
        self._error_log: list[dict[str, Any]] = []

        _LOGGER.info(
            "UnifiedDailyReportScheduler initialized: send_time=%s",
            self.send_time,
        )

    def start(self) -> None:
        """Start the daily report scheduler."""
        if self._task is None or self._task.done():
            self._task = asyncio.create_task(self._run())
            _LOGGER.info("Daily report scheduler started")

    def stop(self) -> None:
        """Stop the daily report scheduler."""
        if self._task and not self._task.done():
            self._task.cancel()
            _LOGGER.info("Daily report scheduler stopped")

    def log_error(self, error: dict[str, Any]) -> None:
        """Log an error for inclusion in the daily report.

        Args:
            error: Error information dictionary

        """
        self._error_log.append(error)

    async def _run(self) -> None:
        """Run the daily report scheduler loop."""
        while True:
            try:
                # Calculate time until next report
                now = datetime.now()
                target_hour, target_minute = map(int, self.send_time.split(":"))

                # Calculate next send time
                next_send = now.replace(
                    hour=target_hour,
                    minute=target_minute,
                    second=0,
                    microsecond=0,
                )

                # If the time has passed today, schedule for tomorrow
                if next_send <= now:
                    next_send += timedelta(days=1)

                # Calculate sleep duration
                sleep_seconds = (next_send - now).total_seconds()

                _LOGGER.info(
                    "Next daily report scheduled for %s (in %.1f hours)",
                    next_send.strftime("%Y-%m-%d %H:%M:%S"),
                    sleep_seconds / 3600,
                )

                # Wait until send time
                await asyncio.sleep(sleep_seconds)

                # Send report
                await self._send_report()

            except asyncio.CancelledError:
                _LOGGER.info("Daily report scheduler cancelled")
                break
            except Exception:
                _LOGGER.exception("Error in daily report scheduler")
                # Sleep a bit before retrying
                await asyncio.sleep(60)

    async def _send_report(self) -> None:
        """Send the daily report."""
        stats = {}

        # Collect statistics if enabled
        if self.include_stats and self._stats_collector:
            try:
                stats = await self._stats_collector()
            except Exception:
                _LOGGER.exception("Error collecting statistics for daily report")

        # Get errors if enabled
        errors = self._error_log.copy() if self.include_errors else None

        # Send report through unified notification service
        try:
            results = await self.notification_service.send_daily_report(
                stats=stats,
                errors=errors,
                backends=None,  # Send to all backends
            )

            success_count = sum(1 for success in results.values() if success)
            if success_count > 0:
                _LOGGER.info(
                    "Daily report sent successfully through %d/%d backend(s)",
                    success_count,
                    len(results),
                )
                # Clear error log after successful send
                self._error_log.clear()
            else:
                _LOGGER.error("Failed to send daily report through any backend")

        except Exception:
            _LOGGER.exception("Exception while sending daily report")

    def set_stats_collector(self, collector) -> None:
        """Set the statistics collector function.

        Args:
            collector: Async function that returns a statistics dictionary

        """
        self._stats_collector = collector
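A minimal wiring sketch for the two classes above. The notification_service argument is assumed to be a configured NotificationService, and the setup function name is hypothetical:

# Hypothetical startup wiring; only the two classes above come from this module.
import asyncio
import logging

async def setup_monitoring(notification_service) -> UnifiedDailyReportScheduler:
    handler = UnifiedAlertHandler(
        notification_service,
        config={"error_threshold": 5, "buffer_minutes": 15, "cooldown_minutes": 15},
        loop=asyncio.get_running_loop(),
    )
    logging.getLogger().addHandler(handler)  # capture ERROR/CRITICAL app-wide

    scheduler = UnifiedDailyReportScheduler(notification_service, {"send_time": "08:00"})
    scheduler.start()
    return scheduler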
src/alpine_bits_python/util/fix_postgres_sequences.py (new file, 274 lines)
@@ -0,0 +1,274 @@
#!/usr/bin/env python3
"""Fix PostgreSQL sequences and migrate datetime columns after SQLite migration.

This script performs two operations:
1. Migrates DateTime columns to TIMESTAMP WITH TIME ZONE for timezone-aware support
2. Resets all ID sequence values to match the current maximum ID in each table

The sequence reset is necessary because the migration script inserts records
with explicit IDs, which doesn't automatically advance PostgreSQL sequences.

The datetime migration ensures proper handling of timezone-aware datetimes,
which is required by the application code.

Schema Support:
    The script automatically detects and uses the schema configured in your config file.
    If you have database.schema: "alpinebits" in your config, it will work with that schema.

Usage:
    # Using the default config.yaml (includes schema if configured)
    uv run python -m alpine_bits_python.util.fix_postgres_sequences

    # Using a specific config file (with schema support)
    uv run python -m alpine_bits_python.util.fix_postgres_sequences \
        --config config/postgres.yaml

    # Using the DATABASE_URL environment variable (schema from config or DATABASE_SCHEMA env var)
    DATABASE_URL="postgresql+asyncpg://user:pass@host/db" \
    DATABASE_SCHEMA="alpinebits" \
    uv run python -m alpine_bits_python.util.fix_postgres_sequences

    # Using a command line argument (schema from config)
    uv run python -m alpine_bits_python.util.fix_postgres_sequences \
        --database-url postgresql+asyncpg://user:pass@host/db
"""

import argparse
import asyncio
import os
import sys
from pathlib import Path

# Add parent directory to path so we can import alpine_bits_python
sys.path.insert(0, str(Path(__file__).parent.parent.parent))

import yaml
from sqlalchemy import text
from sqlalchemy.ext.asyncio import async_sessionmaker, create_async_engine

from alpine_bits_python.db import get_database_schema, get_database_url
from alpine_bits_python.logging_config import get_logger

_LOGGER = get_logger(__name__)


async def migrate_datetime_columns(session, schema_prefix: str = "") -> None:
    """Migrate DateTime columns to TIMESTAMP WITH TIME ZONE.

    This updates the columns to properly handle timezone-aware datetimes.

    Args:
        session: Database session
        schema_prefix: Schema prefix (e.g., "alpinebits." or "")

    """
    _LOGGER.info("\nMigrating DateTime columns to timezone-aware...")

    datetime_columns = [
        ("hashed_customers", "created_at"),
        ("reservations", "created_at"),
        ("acked_requests", "timestamp"),
    ]

    for table_name, column_name in datetime_columns:
        full_table = f"{schema_prefix}{table_name}"
        _LOGGER.info(f"  {full_table}.{column_name}: Converting to TIMESTAMPTZ")
        await session.execute(
            text(
                f"ALTER TABLE {full_table} "
                f"ALTER COLUMN {column_name} TYPE TIMESTAMP WITH TIME ZONE"
            )
        )

    await session.commit()
    _LOGGER.info("✓ DateTime columns migrated to timezone-aware")


async def fix_sequences(database_url: str, schema_name: str | None = None) -> None:
    """Fix PostgreSQL sequences to match current max IDs and migrate datetime columns.

    Args:
        database_url: PostgreSQL database URL
        schema_name: Schema name (e.g., "alpinebits") or None for public

    """
    _LOGGER.info("=" * 70)
    _LOGGER.info("PostgreSQL Migration & Sequence Fix")
    _LOGGER.info("=" * 70)
    _LOGGER.info(
        "Database: %s",
        database_url.split("@")[-1] if "@" in database_url else database_url,
    )
    if schema_name:
        _LOGGER.info("Schema: %s", schema_name)
    _LOGGER.info("=" * 70)

    # Create engine and session with schema support
    connect_args = {}
    if schema_name:
        connect_args = {"server_settings": {"search_path": f"{schema_name},public"}}

    engine = create_async_engine(database_url, echo=False, connect_args=connect_args)
    SessionMaker = async_sessionmaker(engine, expire_on_commit=False)

    # Determine schema prefix for SQL statements
    schema_prefix = f"{schema_name}." if schema_name else ""

    try:
        # Migrate datetime columns first
        async with SessionMaker() as session:
            await migrate_datetime_columns(session, schema_prefix)

        # Then fix sequences
        async with SessionMaker() as session:
            # List of tables and their sequence names
            tables = [
                ("customers", "customers_id_seq"),
                ("hashed_customers", "hashed_customers_id_seq"),
                ("reservations", "reservations_id_seq"),
                ("acked_requests", "acked_requests_id_seq"),
                ("conversions", "conversions_id_seq"),
            ]

            _LOGGER.info("\nResetting sequences...")
            for table_name, sequence_name in tables:
                full_table = f"{schema_prefix}{table_name}"
                full_sequence = f"{schema_prefix}{sequence_name}"

                # Get current max ID
                result = await session.execute(
                    text(f"SELECT MAX(id) FROM {full_table}")
                )
                max_id = result.scalar()

                # Get current sequence value
                result = await session.execute(
                    text(f"SELECT last_value FROM {full_sequence}")
                )
                current_seq = result.scalar()

                if max_id is None:
                    _LOGGER.info(f"  {full_table}: empty table, setting sequence to 1")
                    await session.execute(
                        text(f"SELECT setval('{full_sequence}', 1, false)")
                    )
                elif current_seq <= max_id:
                    new_seq = max_id + 1
                    _LOGGER.info(
                        f"  {full_table}: max_id={max_id}, "
                        f"old_seq={current_seq}, new_seq={new_seq}"
                    )
                    await session.execute(
                        text(f"SELECT setval('{full_sequence}', {new_seq}, false)")
                    )
                else:
                    _LOGGER.info(
                        f"  {full_table}: sequence already correct "
                        f"(max_id={max_id}, seq={current_seq})"
                    )

            await session.commit()

        _LOGGER.info("\n" + "=" * 70)
        _LOGGER.info("✓ Migration completed successfully!")
        _LOGGER.info("=" * 70)
        _LOGGER.info("\nChanges applied:")
        _LOGGER.info("  1. DateTime columns are now timezone-aware (TIMESTAMPTZ)")
        _LOGGER.info("  2. Sequences are reset to match current max IDs")
        _LOGGER.info("\nYou can now insert new records without conflicts.")

    except Exception:
        _LOGGER.exception("Failed to fix sequences")
        raise

    finally:
        await engine.dispose()


async def main():
    """Run the sequence fix."""
    parser = argparse.ArgumentParser(
        description="Fix PostgreSQL sequences after SQLite migration",
        formatter_class=argparse.RawDescriptionHelpFormatter,
        epilog=__doc__,
    )
    parser.add_argument(
        "--database-url",
        help="PostgreSQL database URL (default: from config or DATABASE_URL env var)",
    )
    parser.add_argument(
        "--config",
        help=(
            "Path to config file containing the PostgreSQL database URL "
            "(keeps the password out of bash history)"
        ),
    )

    args = parser.parse_args()

    # Determine database URL and schema
    schema_name = None
    if args.config:
        # Load config file as plain YAML (no validation)
        _LOGGER.info("Loading database config from: %s", args.config)
        try:
            with open(args.config) as f:
                config = yaml.safe_load(f)

            # Get database URL
            if not config or "database" not in config or "url" not in config["database"]:
                _LOGGER.error("Config file must contain database.url")
                sys.exit(1)

            database_url = config["database"]["url"]
            schema_name = config.get("database", {}).get("schema")
            _LOGGER.info("Successfully loaded config")
        except FileNotFoundError:
            _LOGGER.error("Config file not found: %s", args.config)
            sys.exit(1)
        except yaml.YAMLError as e:
            _LOGGER.error("Failed to parse YAML config: %s", e)
            sys.exit(1)
    elif args.database_url:
        database_url = args.database_url
        # Get schema from environment variable
        schema_name = os.environ.get("DATABASE_SCHEMA")
    else:
        # Try environment variable or the default config.yaml
        database_url = os.environ.get("DATABASE_URL")
        schema_name = os.environ.get("DATABASE_SCHEMA")

        if not database_url:
            # Try to load from the default config.yaml as plain YAML
            try:
                with open("config/config.yaml") as f:
                    config = yaml.safe_load(f)
                database_url = config.get("database", {}).get("url")
                if not schema_name:
                    schema_name = config.get("database", {}).get("schema")
            except Exception:
                pass  # Ignore if the default config doesn't exist

    if not database_url:
        _LOGGER.error("No database URL provided")
        _LOGGER.error("Provide via --config, --database-url, or DATABASE_URL env var")
        sys.exit(1)

    # "postgres" also matches "postgresql", so one check suffices
    if "postgres" not in database_url:
        _LOGGER.error("This script only works with PostgreSQL databases.")
        url_type = database_url.split("+")[0] if "+" in database_url else "unknown"
        _LOGGER.error("Current database URL type detected: %s", url_type)
        _LOGGER.error("\nSpecify the PostgreSQL database using one of:")
        _LOGGER.error("  - --config config/postgres.yaml")
        _LOGGER.error("  - DATABASE_URL environment variable")
        _LOGGER.error("  - --database-url postgresql+asyncpg://user:pass@host/db")
        sys.exit(1)

    # Run the fix
    await fix_sequences(database_url, schema_name)


if __name__ == "__main__":
    asyncio.run(main())
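To spot-check the result afterwards, a sketch along these lines can be run with the same engine/session setup as the script above (the helper name is hypothetical):

# Hypothetical verification helper, reusing sqlalchemy's text() as above.
async def verify_sequence(session, table: str, sequence: str) -> None:
    max_id = (await session.execute(text(f"SELECT MAX(id) FROM {table}"))).scalar()
    last_value = (
        await session.execute(text(f"SELECT last_value FROM {sequence}"))
    ).scalar()
    # After a successful fix, last_value should be max_id + 1 (or 1 if empty).
    print(f"{table}: max_id={max_id}, sequence last_value={last_value}")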
src/alpine_bits_python/util/migrate_add_room_types.py (new file, 119 lines)
@@ -0,0 +1,119 @@
#!/usr/bin/env python3
"""Migration script to add RoomTypes fields to the Reservation table.

This migration adds three optional fields to the reservations table:
- room_type_code: String (max 8 chars)
- room_classification_code: String (numeric pattern)
- room_type: String (enum: 1-5)

This script can be run manually before starting the server, or the changes
will be applied automatically when the server starts via Base.metadata.create_all.
"""

import asyncio
import sys
from pathlib import Path

# Add parent directory to path so we can import alpine_bits_python
sys.path.insert(0, str(Path(__file__).parent.parent.parent))

from sqlalchemy import inspect, text
from sqlalchemy.ext.asyncio import create_async_engine

from alpine_bits_python.config_loader import load_config
from alpine_bits_python.db import get_database_url
from alpine_bits_python.logging_config import get_logger, setup_logging

_LOGGER = get_logger(__name__)


async def check_columns_exist(
    engine, table_name: str, columns: list[str]
) -> dict[str, bool]:
    """Check which columns exist in the table.

    Returns a dict mapping column name to whether it exists.
    """
    async with engine.connect() as conn:

        def _check(connection):
            inspector = inspect(connection)
            existing_cols = [col["name"] for col in inspector.get_columns(table_name)]
            return {col: col in existing_cols for col in columns}

        return await conn.run_sync(_check)


async def add_room_types_columns(engine):
    """Add RoomTypes columns to the reservations table if they don't exist."""
    from alpine_bits_python.db import Base

    table_name = "reservations"
    columns_to_add = ["room_type_code", "room_classification_code", "room_type"]

    # First, ensure the table exists by creating all tables if needed
    _LOGGER.info("Ensuring database tables exist...")
    async with engine.begin() as conn:
        await conn.run_sync(Base.metadata.create_all)
    _LOGGER.info("Database tables checked/created.")

    _LOGGER.info("Checking which columns need to be added to %s table...", table_name)

    # Check which columns already exist
    columns_exist = await check_columns_exist(engine, table_name, columns_to_add)

    columns_to_create = [col for col, exists in columns_exist.items() if not exists]

    if not columns_to_create:
        _LOGGER.info(
            "All RoomTypes columns already exist in %s table. No migration needed.",
            table_name,
        )
        return

    _LOGGER.info(
        "Adding columns to %s table: %s", table_name, ", ".join(columns_to_create)
    )

    # Build ALTER TABLE statements for the missing columns.
    # Note: SQLite supports ALTER TABLE ADD COLUMN, but only one column per statement.
    async with engine.begin() as conn:
        for column in columns_to_create:
            sql = f"ALTER TABLE {table_name} ADD COLUMN {column} VARCHAR"
            _LOGGER.info("Executing: %s", sql)
            await conn.execute(text(sql))

    _LOGGER.info(
        "Successfully added %d columns to %s table", len(columns_to_create), table_name
    )


async def main():
    """Run the migration."""
    try:
        # Load config
        config = load_config()
        setup_logging(config)
    except Exception as e:
        _LOGGER.warning("Failed to load config: %s. Using defaults.", e)
        config = {}

    _LOGGER.info("=" * 60)
    _LOGGER.info("Starting RoomTypes Migration")
    _LOGGER.info("=" * 60)

    # Get database URL (mask credentials when logging)
    database_url = get_database_url(config)
    _LOGGER.info(
        "Database URL: %s",
        database_url.replace("://", "://***:***@").split("@")[-1],
    )

    # Create engine
    engine = create_async_engine(database_url, echo=False)

    try:
        # Run migration
        await add_room_types_columns(engine)

        _LOGGER.info("=" * 60)
        _LOGGER.info("Migration completed successfully!")
        _LOGGER.info("=" * 60)

    except Exception:
        _LOGGER.exception("Migration failed")
        sys.exit(1)
    finally:
        await engine.dispose()


if __name__ == "__main__":
    asyncio.run(main())
src/alpine_bits_python/util/migrate_sqlite_to_postgres.py (new file, 515 lines)
@@ -0,0 +1,515 @@
#!/usr/bin/env python3
"""Migration script to copy data from SQLite to PostgreSQL.

This script:
1. Connects to both SQLite and PostgreSQL databases
2. Reads all data from SQLite using SQLAlchemy models
3. Writes data to PostgreSQL using the same models
4. Ensures data integrity and provides progress feedback

Prerequisites:
- PostgreSQL database must be created and empty (or you can use the --drop-tables flag)
- asyncpg must be installed: uv pip install asyncpg
- Configure the target PostgreSQL URL in config.yaml or via the DATABASE_URL env var

Usage:
    # Dry run (preview what will be migrated)
    uv run python -m alpine_bits_python.util.migrate_sqlite_to_postgres --dry-run

    # Actual migration using a target config file
    uv run python -m alpine_bits_python.util.migrate_sqlite_to_postgres \
        --target-config config/postgres.yaml

    # Drop existing tables first (careful!)
    uv run python -m alpine_bits_python.util.migrate_sqlite_to_postgres \
        --target-config config/postgres.yaml --drop-tables

    # Alternative: use the DATABASE_URL environment variable
    DATABASE_URL="postgresql+asyncpg://user:pass@host/db" \
    uv run python -m alpine_bits_python.util.migrate_sqlite_to_postgres

    # Alternative: specify URLs directly
    uv run python -m alpine_bits_python.util.migrate_sqlite_to_postgres \
        --source sqlite+aiosqlite:///old.db \
        --target postgresql+asyncpg://user:pass@localhost/dbname
"""

import argparse
import asyncio
import os
import sys
from pathlib import Path

# Add parent directory to path so we can import alpine_bits_python
sys.path.insert(0, str(Path(__file__).parent.parent.parent))

import yaml
from sqlalchemy import select, text
from sqlalchemy.ext.asyncio import AsyncSession, async_sessionmaker, create_async_engine

from alpine_bits_python.config_loader import load_config
from alpine_bits_python.db import (
    AckedRequest,
    Base,
    Customer,
    HashedCustomer,
    Reservation,
    get_database_url,
)
from alpine_bits_python.logging_config import get_logger, setup_logging

_LOGGER = get_logger(__name__)


def mask_db_url(url: str) -> str:
    """Mask sensitive parts of a database URL for logging."""
    if "://" not in url:
        return url
    protocol, rest = url.split("://", 1)
    if "@" in rest:
        _credentials, location = rest.split("@", 1)
        return f"{protocol}://***:***@{location}"
    return url


async def get_table_counts(session: AsyncSession) -> dict[str, int]:
    """Get row counts for all tables.

    Note: this loads full rows to count them, which is acceptable for the
    modest table sizes this migration targets.
    """
    counts = {}

    result = await session.execute(select(Customer))
    counts["customers"] = len(result.scalars().all())

    result = await session.execute(select(HashedCustomer))
    counts["hashed_customers"] = len(result.scalars().all())

    result = await session.execute(select(Reservation))
    counts["reservations"] = len(result.scalars().all())

    result = await session.execute(select(AckedRequest))
    counts["acked_requests"] = len(result.scalars().all())

    return counts


async def reset_sequences(session: AsyncSession) -> None:
    """Reset PostgreSQL sequences to match the current max ID values.

    This is necessary after migrating data with explicit IDs from SQLite,
    as PostgreSQL sequences won't automatically advance when IDs are set explicitly.
    Example: after copying rows with explicit ids 1..42, nextval() would still
    hand out 1 and collide; setval(seq, 43, false) makes the next INSERT get 43.
    """
    tables = [
        ("customers", "customers_id_seq"),
        ("hashed_customers", "hashed_customers_id_seq"),
        ("reservations", "reservations_id_seq"),
        ("acked_requests", "acked_requests_id_seq"),
    ]

    for table_name, sequence_name in tables:
        # Set sequence to max(id) + 1, or 1 if table is empty
        query = text(f"""
            SELECT setval('{sequence_name}',
                          COALESCE((SELECT MAX(id) FROM {table_name}), 0) + 1,
                          false)
        """)
        await session.execute(query)

    await session.commit()


async def migrate_data(
    source_url: str,
    target_url: str,
    dry_run: bool = False,
    drop_tables: bool = False,
) -> None:
    """Migrate data from the source database to the target database.

    Args:
        source_url: Source database URL (SQLite)
        target_url: Target database URL (PostgreSQL)
        dry_run: If True, only preview what would be migrated
        drop_tables: If True, drop existing tables in the target before creating
    """
    _LOGGER.info("=" * 70)
    _LOGGER.info("SQLite to PostgreSQL Migration")
    _LOGGER.info("=" * 70)
    _LOGGER.info("Source: %s", mask_db_url(source_url))
    _LOGGER.info("Target: %s", mask_db_url(target_url))
    _LOGGER.info("Mode: %s", "DRY RUN" if dry_run else "LIVE MIGRATION")
    _LOGGER.info("=" * 70)

    # Create engines
    _LOGGER.info("Creating database connections...")
    source_engine = create_async_engine(source_url, echo=False)
    target_engine = create_async_engine(target_url, echo=False)

    # Create session makers
    SourceSession = async_sessionmaker(source_engine, expire_on_commit=False)
    TargetSession = async_sessionmaker(target_engine, expire_on_commit=False)

    try:
        # Check source database
        _LOGGER.info("\nChecking source database...")
        async with SourceSession() as source_session:
            source_counts = await get_table_counts(source_session)

        _LOGGER.info("Source database contains:")
        for table, count in source_counts.items():
            _LOGGER.info("  - %s: %d rows", table, count)

        total_rows = sum(source_counts.values())
        if total_rows == 0:
            _LOGGER.warning("Source database is empty. Nothing to migrate.")
            return

        if dry_run:
            _LOGGER.info("\n" + "=" * 70)
            _LOGGER.info("DRY RUN: Would migrate %d total rows", total_rows)
            _LOGGER.info("=" * 70)
            return

        # Prepare target database
        _LOGGER.info("\nPreparing target database...")

        if drop_tables:
            _LOGGER.warning("Dropping existing tables in target database...")
            async with target_engine.begin() as conn:
                await conn.run_sync(Base.metadata.drop_all)
            _LOGGER.info("Tables dropped.")

        _LOGGER.info("Creating tables in target database...")
        async with target_engine.begin() as conn:
            await conn.run_sync(Base.metadata.create_all)
        _LOGGER.info("Tables created.")

        # Check if target already has data
        _LOGGER.info("\nChecking target database...")
        async with TargetSession() as target_session:
            target_counts = await get_table_counts(target_session)

        if sum(target_counts.values()) > 0:
            _LOGGER.warning("Target database is not empty:")
            for table, count in target_counts.items():
                if count > 0:
                    _LOGGER.warning("  - %s: %d rows", table, count)

            response = input("\nContinue anyway? This may cause conflicts. (yes/no): ")
            if response.lower() != "yes":
                _LOGGER.info("Migration cancelled.")
                return

        # Migrate data table by table
        _LOGGER.info("\n" + "=" * 70)
        _LOGGER.info("Starting data migration...")
        _LOGGER.info("=" * 70)

        # 1. Migrate Customers first (no dependencies)
        _LOGGER.info("\n[1/6] Migrating Customers...")
        async with SourceSession() as source_session:
            result = await source_session.execute(select(Customer))
            customers = result.scalars().all()

        if customers:
            async with TargetSession() as target_session:
                for i, customer in enumerate(customers, 1):
                    # Create new instance with same data
                    new_customer = Customer(
                        id=customer.id,
                        given_name=customer.given_name,
                        contact_id=customer.contact_id,
                        surname=customer.surname,
                        name_prefix=customer.name_prefix,
                        email_address=customer.email_address,
                        phone=customer.phone,
                        email_newsletter=customer.email_newsletter,
                        address_line=customer.address_line,
                        city_name=customer.city_name,
                        postal_code=customer.postal_code,
                        country_code=customer.country_code,
                        gender=customer.gender,
                        birth_date=customer.birth_date,
                        language=customer.language,
                        address_catalog=customer.address_catalog,
                        name_title=customer.name_title,
                    )
                    target_session.add(new_customer)

                    if i % 100 == 0:
                        _LOGGER.info("  Progress: %d/%d customers", i, len(customers))

                await target_session.commit()

            _LOGGER.info("✓ Migrated %d customers", len(customers))

        # 2. Migrate HashedCustomers (depends on Customers)
        _LOGGER.info("\n[2/6] Migrating HashedCustomers...")
        async with SourceSession() as source_session:
            result = await source_session.execute(select(HashedCustomer))
            hashed_customers = result.scalars().all()

        if hashed_customers:
            async with TargetSession() as target_session:
                for i, hashed in enumerate(hashed_customers, 1):
                    new_hashed = HashedCustomer(
                        id=hashed.id,
                        customer_id=hashed.customer_id,
                        contact_id=hashed.contact_id,
                        hashed_email=hashed.hashed_email,
                        hashed_phone=hashed.hashed_phone,
                        hashed_given_name=hashed.hashed_given_name,
                        hashed_surname=hashed.hashed_surname,
                        hashed_city=hashed.hashed_city,
                        hashed_postal_code=hashed.hashed_postal_code,
                        hashed_country_code=hashed.hashed_country_code,
                        hashed_gender=hashed.hashed_gender,
                        hashed_birth_date=hashed.hashed_birth_date,
                        created_at=hashed.created_at,
                    )
                    target_session.add(new_hashed)

                    if i % 100 == 0:
                        _LOGGER.info(
                            "  Progress: %d/%d hashed customers",
                            i,
                            len(hashed_customers),
                        )

                await target_session.commit()

            _LOGGER.info("✓ Migrated %d hashed customers", len(hashed_customers))

        # 3. Migrate Reservations (depends on Customers)
        _LOGGER.info("\n[3/6] Migrating Reservations...")
        async with SourceSession() as source_session:
            result = await source_session.execute(select(Reservation))
            reservations = result.scalars().all()

        if reservations:
            async with TargetSession() as target_session:
                for i, reservation in enumerate(reservations, 1):
                    new_reservation = Reservation(
                        id=reservation.id,
                        customer_id=reservation.customer_id,
                        unique_id=reservation.unique_id,
                        md5_unique_id=reservation.md5_unique_id,
                        start_date=reservation.start_date,
                        end_date=reservation.end_date,
                        num_adults=reservation.num_adults,
                        num_children=reservation.num_children,
                        children_ages=reservation.children_ages,
                        offer=reservation.offer,
                        created_at=reservation.created_at,
                        utm_source=reservation.utm_source,
                        utm_medium=reservation.utm_medium,
                        utm_campaign=reservation.utm_campaign,
                        utm_term=reservation.utm_term,
                        utm_content=reservation.utm_content,
                        user_comment=reservation.user_comment,
                        fbclid=reservation.fbclid,
                        gclid=reservation.gclid,
                        hotel_code=reservation.hotel_code,
                        hotel_name=reservation.hotel_name,
                        room_type_code=reservation.room_type_code,
                        room_classification_code=reservation.room_classification_code,
                        room_type=reservation.room_type,
                    )
                    target_session.add(new_reservation)

                    if i % 100 == 0:
                        _LOGGER.info(
                            "  Progress: %d/%d reservations", i, len(reservations)
                        )

                await target_session.commit()

            _LOGGER.info("✓ Migrated %d reservations", len(reservations))

        # 4. Migrate AckedRequests (no dependencies)
        _LOGGER.info("\n[4/6] Migrating AckedRequests...")
        async with SourceSession() as source_session:
            result = await source_session.execute(select(AckedRequest))
            acked_requests = result.scalars().all()

        if acked_requests:
            async with TargetSession() as target_session:
                for i, acked in enumerate(acked_requests, 1):
                    new_acked = AckedRequest(
                        id=acked.id,
                        client_id=acked.client_id,
                        # Copy username too when the source schema has it, so
                        # the username-based acknowledgement lookup keeps working.
                        username=getattr(acked, "username", None),
                        unique_id=acked.unique_id,
                        timestamp=acked.timestamp,
                    )
                    target_session.add(new_acked)

                    if i % 100 == 0:
                        _LOGGER.info(
                            "  Progress: %d/%d acked requests", i, len(acked_requests)
                        )

                await target_session.commit()

            _LOGGER.info("✓ Migrated %d acked requests", len(acked_requests))

        # 5. Migrate datetime columns to timezone-aware
        _LOGGER.info("\n[5/6] Converting DateTime columns to timezone-aware...")
        async with target_engine.begin() as conn:
            await conn.execute(
                text(
                    "ALTER TABLE hashed_customers "
                    "ALTER COLUMN created_at TYPE TIMESTAMP WITH TIME ZONE"
                )
            )
            await conn.execute(
                text(
                    "ALTER TABLE reservations "
                    "ALTER COLUMN created_at TYPE TIMESTAMP WITH TIME ZONE"
                )
            )
            await conn.execute(
                text(
                    "ALTER TABLE acked_requests "
                    "ALTER COLUMN timestamp TYPE TIMESTAMP WITH TIME ZONE"
                )
            )
        _LOGGER.info("✓ DateTime columns converted to timezone-aware")

        # 6. Reset PostgreSQL sequences
        _LOGGER.info("\n[6/6] Resetting PostgreSQL sequences...")
        async with TargetSession() as target_session:
            await reset_sequences(target_session)
        _LOGGER.info("✓ Sequences reset to match current max IDs")

        # Verify migration
        _LOGGER.info("\n" + "=" * 70)
        _LOGGER.info("Verifying migration...")
        _LOGGER.info("=" * 70)

        async with TargetSession() as target_session:
            final_counts = await get_table_counts(target_session)

        _LOGGER.info("Target database now contains:")
        all_match = True
        for table, count in final_counts.items():
            source_count = source_counts[table]
            match = "✓" if count == source_count else "✗"
            _LOGGER.info(
                "  %s %s: %d rows (source: %d)", match, table, count, source_count
            )
            if count != source_count:
                all_match = False

        if all_match:
            _LOGGER.info("\n" + "=" * 70)
            _LOGGER.info("✓ Migration completed successfully!")
            _LOGGER.info("=" * 70)
            _LOGGER.info("\nNext steps:")
            _LOGGER.info("1. Test your application with PostgreSQL")
            _LOGGER.info("2. Update config.yaml or DATABASE_URL to use PostgreSQL")
            _LOGGER.info("3. Keep the SQLite backup until you're confident everything works")
        else:
            _LOGGER.error("\n" + "=" * 70)
            _LOGGER.error("✗ Migration completed with mismatches!")
            _LOGGER.error("=" * 70)
            _LOGGER.error("Please review the counts above and investigate.")

    except Exception:
        _LOGGER.exception("Migration failed")
        raise

    finally:
        await source_engine.dispose()
        await target_engine.dispose()


async def main():
    """Run the migration."""
    parser = argparse.ArgumentParser(
        description="Migrate data from SQLite to PostgreSQL",
        formatter_class=argparse.RawDescriptionHelpFormatter,
        epilog=__doc__,
    )
    parser.add_argument(
        "--source",
        help="Source database URL (default: from config or sqlite+aiosqlite:///alpinebits.db)",
    )
    parser.add_argument(
        "--target",
        help=(
            "Target database URL "
            "(default: from DATABASE_URL env var or --target-config)"
        ),
    )
    parser.add_argument(
        "--target-config",
        help=(
            "Path to config file containing the target PostgreSQL database URL "
            "(keeps the password out of bash history)"
        ),
    )
    parser.add_argument(
        "--dry-run",
        action="store_true",
        help="Preview migration without making changes",
    )
    parser.add_argument(
        "--drop-tables",
        action="store_true",
        help="Drop existing tables in the target database before migration",
    )

    args = parser.parse_args()

    try:
        # Load config
        config = load_config()
        setup_logging(config)
    except Exception as e:
        _LOGGER.warning("Failed to load config: %s. Using defaults.", e)
        config = {}

    # Determine source URL (defaults to SQLite)
    if args.source:
        source_url = args.source
    else:
        source_url = get_database_url(config)
        if "sqlite" not in source_url:
            _LOGGER.error("Source database must be SQLite. Use --source to specify.")
            sys.exit(1)

    # Determine target URL (must be PostgreSQL)
    if args.target:
        target_url = args.target
    elif args.target_config:
        # Load target config file manually (simpler YAML without secrets)
        _LOGGER.info("Loading target database config from: %s", args.target_config)
        try:
            config_path = Path(args.target_config)
            with config_path.open() as f:
                target_config = yaml.safe_load(f)
            target_url = target_config["database"]["url"]
            _LOGGER.info("Successfully loaded target config")
        except (FileNotFoundError, ValueError, KeyError):
            _LOGGER.exception("Failed to load target config")
            _LOGGER.info(
                "Config file should contain: database.url with the PostgreSQL connection"
            )
            sys.exit(1)
    else:
        target_url = os.environ.get("DATABASE_URL")
        if not target_url:
            _LOGGER.error("Target database URL not specified.")
            _LOGGER.error("Specify the target database using one of:")
            _LOGGER.error("  - --target-config config/postgres.yaml")
            _LOGGER.error("  - DATABASE_URL environment variable")
            _LOGGER.error("  - --target postgresql+asyncpg://user:pass@host/db")
            sys.exit(1)

    # "postgres" also matches "postgresql", so one check suffices
    if "postgres" not in target_url:
        _LOGGER.error("Target database must be PostgreSQL.")
        sys.exit(1)

    # Run migration
    await migrate_data(
        source_url=source_url,
        target_url=target_url,
        dry_run=args.dry_run,
        drop_tables=args.drop_tables,
    )


if __name__ == "__main__":
    asyncio.run(main())
src/alpine_bits_python/worker_coordination.py (new file, 165 lines)
@@ -0,0 +1,165 @@
|
||||
"""Worker coordination utilities for multi-worker FastAPI deployments.
|
||||
|
||||
This module provides utilities to ensure singleton services (schedulers, background tasks)
|
||||
run on only one worker when using uvicorn --workers N.
|
||||
"""
|
||||
|
||||
import fcntl
|
||||
import os
|
||||
from pathlib import Path
|
||||
from typing import ContextManager
|
||||
|
||||
from .logging_config import get_logger

_LOGGER = get_logger(__name__)


class WorkerLock:
    """File-based lock to coordinate worker processes.

    Only one worker can hold the lock at a time. This ensures singleton
    services like schedulers only run on one worker.
    """

    def __init__(self, lock_file: str | None = None):
        """Initialize the worker lock.

        Args:
            lock_file: Path to the lock file. If None, will try /var/run first,
                falling back to /tmp if /var/run is not writable.
        """
        if lock_file is None:
            # Try /var/run first (more persistent), fall back to /tmp
            for candidate in [
                "/var/run/alpinebits_primary_worker.lock",
                "/tmp/alpinebits_primary_worker.lock",
            ]:
                try:
                    candidate_path = Path(candidate)
                    candidate_path.parent.mkdir(parents=True, exist_ok=True)
                    # Test if we can write to this location
                    test_file = candidate_path.parent / ".alpinebits_test"
                    test_file.touch()
                    test_file.unlink()
                    lock_file = candidate
                    break
                except (PermissionError, OSError):
                    continue
            else:
                # If all candidates fail, default to /tmp
                lock_file = "/tmp/alpinebits_primary_worker.lock"

        self.lock_file = Path(lock_file)
        self.lock_fd = None
        self.is_primary = False

    def acquire(self) -> bool:
        """Try to acquire the primary worker lock.

        Returns:
            True if the lock was acquired (this is the primary worker),
            False if the lock is held by another worker.
        """
        try:
            # Create the lock file's directory if it doesn't exist
            self.lock_file.parent.mkdir(parents=True, exist_ok=True)

            # Open lock file
            self.lock_fd = open(self.lock_file, "w")

            # Try to acquire an exclusive lock (non-blocking)
            fcntl.flock(self.lock_fd.fileno(), fcntl.LOCK_EX | fcntl.LOCK_NB)

            # Write PID to lock file for debugging
            self.lock_fd.write(f"{os.getpid()}\n")
            self.lock_fd.flush()

            self.is_primary = True
            _LOGGER.info(
                "Acquired primary worker lock (pid=%d, lock_file=%s)",
                os.getpid(),
                self.lock_file,
            )
            return True

        except (IOError, OSError):
            # Lock is held by another process
            if self.lock_fd:
                self.lock_fd.close()
                self.lock_fd = None

            self.is_primary = False
            _LOGGER.info(
                "Could not acquire primary worker lock - another worker is primary (pid=%d)",
                os.getpid(),
            )
            return False

    def release(self) -> None:
        """Release the primary worker lock."""
        if self.lock_fd and self.is_primary:
            try:
                fcntl.flock(self.lock_fd.fileno(), fcntl.LOCK_UN)
                self.lock_fd.close()

                # Try to remove the lock file (best effort)
                try:
                    self.lock_file.unlink()
                except Exception:
                    pass

                _LOGGER.info("Released primary worker lock (pid=%d)", os.getpid())
            except Exception:
                _LOGGER.exception("Error releasing primary worker lock")
            finally:
                self.lock_fd = None
                self.is_primary = False

    def __enter__(self) -> "WorkerLock":
        """Context manager entry."""
        self.acquire()
        return self

    def __exit__(self, exc_type, exc_val, exc_tb) -> None:
        """Context manager exit."""
        self.release()


def is_primary_worker() -> tuple[bool, WorkerLock | None]:
    """Determine if this worker should run singleton services.

    Uses file-based locking to coordinate between workers.
    Includes stale lock detection and cleanup.

    Returns:
        Tuple of (is_primary, lock_object)
        - is_primary: True if this is the primary worker
        - lock_object: WorkerLock instance (must be kept alive)
    """
    lock = WorkerLock()

    # Check for stale locks from dead processes
    if lock.lock_file.exists():
        try:
            with open(lock.lock_file, "r") as f:
                old_pid_str = f.read().strip()
                if old_pid_str:
                    old_pid = int(old_pid_str)
                    # Check if the process with this PID still exists
                    try:
                        os.kill(old_pid, 0)  # Signal 0 just checks existence
                        _LOGGER.debug("Lock held by active process pid=%d", old_pid)
                    except ProcessLookupError:
                        # Process is dead, remove the stale lock
                        _LOGGER.warning(
                            "Removing stale lock file from dead process pid=%d",
                            old_pid,
                        )
                        try:
                            lock.lock_file.unlink()
                        except Exception as e:
                            _LOGGER.warning("Failed to remove stale lock: %s", e)
        except (ValueError, FileNotFoundError, PermissionError) as e:
            _LOGGER.warning("Error checking lock file: %s", e)

    is_primary = lock.acquire()

    return is_primary, lock
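
A minimal usage sketch for the lock above, assuming it is wired into worker startup (start_scheduler and serve_requests are hypothetical placeholders, not part of this change):

    # Each worker calls this at startup; exactly one becomes primary.
    is_primary, lock = is_primary_worker()
    if is_primary:
        start_scheduler()  # hypothetical singleton service
    try:
        serve_requests()   # all workers handle HTTP regardless of lock state
    finally:
        lock.release()     # no-op on non-primary workers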
291  test_capi.py  Normal file
@@ -0,0 +1,291 @@
#!/usr/bin/env python3
"""Test sending a test event to the Meta Conversions API."""

import asyncio
import json
import logging
import time

from sqlalchemy.ext.asyncio import AsyncSession, create_async_engine
from sqlalchemy.orm import sessionmaker

from src.alpine_bits_python.customer_service import CustomerService
from src.alpine_bits_python.db import Base
from src.alpine_bits_python.reservation_service import ReservationService

# Set up logging
logging.basicConfig(
    level=logging.INFO, format="%(asctime)s - %(name)s - %(levelname)s - %(message)s"
)
logger = logging.getLogger(__name__)

TEST_CODE = "TEST54726"

# Meta CAPI configuration (placeholder values)
PIXEL_ID = "539512870322352"
ACCESS_TOKEN = "EAATsRaQOv94BPoib5XUn9ZBjNPfeZB4JlKJR1LYtiMdzbEoIa7XFDmHq3pY8UvOcHnbNYDraym107hwRd3EfzO8EpQ5ZB5C4OfF7KJ41KIrfQntkoWrCYcQReecd4vhzVk82hjm55yGDhkxzuNuzG85FZCT0nZB6VyIxZAVLR2estoUSAoQ06J742aMkZCN2AZDZD"
CAPI_ENDPOINT = f"https://graph.facebook.com/v19.0/{PIXEL_ID}/events"


async def load_test_data_from_db():
    """Load reservations and hashed customers from the database."""
    # Connect to the test database
    db_url = "sqlite+aiosqlite:///alpinebits_capi_test.db"
    engine = create_async_engine(db_url, echo=False)

    # Create tables if they don't exist
    async with engine.begin() as conn:
        await conn.run_sync(Base.metadata.create_all)

    # Create async session
    async_session = sessionmaker(engine, class_=AsyncSession, expire_on_commit=False)

    async with async_session() as session:
        # Initialize services
        reservation_service = ReservationService(session)
        customer_service = CustomerService(session)

        # Get all reservations with customers
        reservations_with_customers = (
            await reservation_service.get_reservations_with_filters()
        )

        if not reservations_with_customers:
            logger.warning("No reservations found in database")
            return []

        logger.info("Found %d reservations", len(reservations_with_customers))

        # Prepare data with hashed customer info
        result = []
        for reservation, customer in reservations_with_customers:
            # Get hashed customer data
            hashed_customer = await customer_service.get_hashed_customer(customer.id)

            result.append(
                {
                    "reservation": reservation,
                    "customer": customer,
                    "hashed_customer": hashed_customer,
                }
            )

    await engine.dispose()
    return result


def _build_user_data(hashed_customer):
    """Build the user_data dict from hashed customer information.

    Args:
        hashed_customer: HashedCustomer database object with SHA256 hashed PII

    Returns:
        dict: User data for the Meta Conversions API
    """
    user_data = {}
    if not hashed_customer:
        return user_data

    # Map hashed customer fields to Meta CAPI field names
    field_mapping = {
        "hashed_email": "em",
        "hashed_phone": "ph",
        "hashed_given_name": "fn",
        "hashed_surname": "ln",
        "hashed_city": "ct",
        "hashed_postal_code": "zp",
        "hashed_country_code": "country",
        "hashed_gender": "ge",
        "hashed_birth_date": "db",
    }

    for source_field, target_field in field_mapping.items():
        value = getattr(hashed_customer, source_field, None)
        if value:
            user_data[target_field] = value

    return user_data
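
Meta's Conversions API expects the user_data fields above (em, ph, fn, ...) to be normalized and SHA-256 hashed before sending. A sketch of the normalization the HashedCustomer columns are assumed to already carry (_normalize_and_hash is illustrative, not a function in this codebase):

    import hashlib

    def _normalize_and_hash(value: str) -> str:
        # Lowercase and trim before hashing; Meta matches on the hex digest
        return hashlib.sha256(value.strip().lower().encode("utf-8")).hexdigest()

    # _normalize_and_hash("John.Doe@Example.com ") -> a 64-character hex digest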

def _build_custom_data(reservation, booking_value):
    """Build the custom_data dict from reservation information.

    Args:
        reservation: Reservation database object
        booking_value: Booking value in EUR

    Returns:
        dict: Custom data for the Meta Conversions API
    """
    custom_data = {
        "currency": "EUR",
        "value": booking_value,
        "content_type": "hotel_booking",
    }

    # Add optional reservation details
    optional_fields = {
        "hotel_code": "hotel_code",
        "hotel_name": "hotel_name",
        "num_adults": "num_adults",
        "num_children": "num_children",
    }

    for source_field, target_field in optional_fields.items():
        value = getattr(reservation, source_field, None)
        if value:
            custom_data[target_field] = value

    # Add date fields in ISO format
    if reservation.start_date:
        custom_data["checkin_date"] = reservation.start_date.isoformat()
    if reservation.end_date:
        custom_data["checkout_date"] = reservation.end_date.isoformat()

    return custom_data


def _add_utm_parameters(custom_data, reservation):
    """Add UTM parameters to custom_data if available.

    Args:
        custom_data: Custom data dict to modify
        reservation: Reservation database object
    """
    utm_fields = ["utm_source", "utm_medium", "utm_campaign", "utm_term", "utm_content"]
    if any(getattr(reservation, field, None) for field in utm_fields):
        for field in utm_fields:
            custom_data[field] = getattr(reservation, field, None)


def _format_fbc(fbclid, timestamp):
    """Format a Facebook Click ID (fbclid) as the fbc parameter.

    The fbc format is: fb.{subdomain_index}.{timestamp_ms}.{fbclid}

    Args:
        fbclid: Facebook Click ID from the ad URL
        timestamp: DateTime object from reservation creation

    Returns:
        str: Formatted fbc value for the Meta Conversions API
    """
    # Extract timestamp in milliseconds
    timestamp_ms = int(timestamp.timestamp() * 1000)
    # Subdomain index is typically 1
    subdomain_index = 1
    return f"fb.{subdomain_index}.{timestamp_ms}.{fbclid}"

def create_meta_capi_event(reservation, customer, hashed_customer):
    """Create a Meta Conversions API event from reservation and customer data.

    Args:
        reservation: Reservation database object
        customer: Customer database object (currently unused)
        hashed_customer: HashedCustomer database object with SHA256 hashed PII

    Returns:
        dict: Formatted event for the Meta Conversions API
    """
    del customer  # Currently unused but kept for API consistency
    # Fixed example booking value in EUR
    booking_value = 1250.00

    # Current timestamp
    event_time = int(time.time())

    # Build user_data with hashed customer information
    user_data = _build_user_data(hashed_customer)

    # Add tracking parameters if available
    if reservation.fbclid and reservation.created_at:
        # Format fbclid as the fbc parameter
        user_data["fbc"] = _format_fbc(reservation.fbclid, reservation.created_at)
    if reservation.gclid:
        user_data["gclid"] = reservation.gclid

    # Build custom_data
    custom_data = _build_custom_data(reservation, booking_value)

    # Add UTM parameters to the event
    _add_utm_parameters(custom_data, reservation)

    # Return the event
    return {
        "event_name": "Purchase",
        "event_time": event_time,
        "event_id": reservation.unique_id,  # Unique event ID for deduplication
        "event_source_url": "https://example.com/booking-confirmation",
        "action_source": "website",
        "user_data": user_data,
        "custom_data": custom_data,
    }


async def send_test_event():
    """Load data from the DB and create a test Meta CAPI event."""
    logger.info("Loading test data from database...")

    # Load data
    test_data = await load_test_data_from_db()

    if not test_data:
        logger.error("No test data available. Please add reservations to the database.")
        return

    # Use the first reservation for testing
    data = test_data[0]
    reservation = data["reservation"]
    customer = data["customer"]
    hashed_customer = data["hashed_customer"]

    logger.info("Using reservation: %s", reservation.unique_id)
    logger.info("Customer: %s %s", customer.given_name, customer.surname)

    # Create the event
    event = create_meta_capi_event(reservation, customer, hashed_customer)

    # Create the full payload with test_event_code at the top level
    payload = {
        "data": [event],
        "test_event_code": TEST_CODE,
    }

    # Log the event (pretty print)
    separator = "=" * 80
    logger.info("\n%s", separator)
    logger.info("META CONVERSIONS API EVENT")
    logger.info("%s", separator)
    logger.info("\nEndpoint: %s", CAPI_ENDPOINT)
    logger.info("\nPayload:\n%s", json.dumps(payload, indent=2))
    logger.info("\n%s", separator)

    logger.info("\nNOTE: This is a test event. To actually send it:")
    logger.info("1. Set PIXEL_ID to your Meta Pixel ID")
    logger.info("2. Set ACCESS_TOKEN to your Meta access token")
    logger.info("3. Uncomment the httpx.post() call below")
    logger.info(
        "4. Test the event at: https://developers.facebook.com/tools/events_manager/"
    )
    logger.info("   Use test event code: %s", TEST_CODE)

    # Uncomment to actually send the event (also requires adding "import httpx" above)

    # async with httpx.AsyncClient() as client:
    #     response = await client.post(
    #         CAPI_ENDPOINT,
    #         json=payload,
    #         params={"access_token": ACCESS_TOKEN},
    #     )
    #     logger.info("Response status: %s", response.status_code)
    #     logger.info("Response body: %s", response.text)


if __name__ == "__main__":
    asyncio.run(send_test_event())
@@ -1,29 +0,0 @@
#!/usr/bin/env python3
"""Test the handshake functionality with the real AlpineBits sample file."""

import asyncio

from alpine_bits_python.alpinebits_server import AlpineBitsServer


async def main():
    # Create server instance
    server = AlpineBitsServer()

    # Read the sample handshake request
    with open(
        "AlpineBits-HotelData-2024-10/files/samples/Handshake/Handshake-OTA_PingRQ.xml",
    ) as f:
        ping_request_xml = f.read()

    # Handle the ping request
    await server.handle_request(
        "OTA_Ping:Handshaking", ping_request_xml, "2024-10"
    )


if __name__ == "__main__":
    asyncio.run(main())
52  test_langdetect.py  Normal file
@@ -0,0 +1,52 @@
#!/usr/bin/env python3
"""Test script for the fast-langdetect library.

Tests language detection on various sample texts.
"""

from fast_langdetect import detect

# Test strings in different languages
test_strings = [
    ("Hello, how are you doing today?", "English"),
    ("Bonjour, comment allez-vous aujourd'hui?", "French"),
    ("Hola, ¿cómo estás hoy?", "Spanish"),
    (
        "Hallo, ich würde diese Wohnung gerne am 22.10.25 am späten Nachmittag besichtigen. Wir reisen aus Berlin an. Mit freundlichen Grüßen Dr. Christoph Garcia Bartels",
        "German",
    ),
    ("Ciao, come stai oggi?", "Italian"),
    ("Olá, como você está hoje?", "Portuguese"),
    ("Привет, как дела сегодня?", "Russian"),
    ("こんにちは、今日はお元気ですか?", "Japanese"),
    ("你好,你今天怎么样?", "Chinese"),
    (
        "Ciao, questo appartamento mi interessa e mi piacerebbe visitarlo. Grazie",
        "Italian",
    ),
    ("مرحبا، كيف حالك اليوم؟", "Arabic"),
    ("Hej, hur mår du idag?", "Swedish"),
    (
        "Guten Tag! Koennte ich diese Wohnun bitte besichtigen kommen? Vielleicht sogar schon morgen, Mittwoch den 15.10.? Ich waere sowieso im Unterland und koennte gegen 12 Uhr dort sein. Danke fuer eine kurze Rueckmeldung diesbezueglich, Catherina",
        "German",
    ),
    ("Witam, jak się dzisiaj masz?", "Polish"),
]


def main():
    print("Testing fast-langdetect library")
    print("=" * 60)
    print()

    for text, expected_lang in test_strings:
        detected = detect(text)
        print(f"Text: {text[:50]}...")
        print(f"Expected: {expected_lang}")
        print(f"Detected: {detected}")
        print("-" * 60)
        print()


if __name__ == "__main__":
    main()
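
For reference, detect() returns a dict rather than a bare language code, so the loop above prints something like the following (the score value is illustrative):

    detect("Hello, how are you doing today?")
    # -> {"lang": "en", "score": 0.96}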
BIN  test_migration.db  Normal file
Binary file not shown.

322  test_pushover.py  Normal file
@@ -0,0 +1,322 @@
#!/usr/bin/env python3
"""Test script to verify Pushover push notification connectivity.

This script tests Pushover API connectivity and sends test notifications
to help verify that the configuration is correct.
"""

import sys
import time
from datetime import datetime

# Load configuration from config.yaml
try:
    from alpine_bits_python.config_loader import load_config
    from pushover_complete import PushoverAPI

    print("Loading configuration from config.yaml...")
    config = load_config()
    pushover_config = config.get("pushover", {})

    USER_KEY = pushover_config.get("user_key", "")
    API_TOKEN = pushover_config.get("api_token", "")

    # Get monitoring configuration
    monitoring_config = pushover_config.get("monitoring", {})
    daily_report_config = monitoring_config.get("daily_report", {})
    error_alerts_config = monitoring_config.get("error_alerts", {})

    DAILY_REPORT_ENABLED = daily_report_config.get("enabled", False)
    DAILY_REPORT_PRIORITY = daily_report_config.get("priority", 0)
    ERROR_ALERTS_ENABLED = error_alerts_config.get("enabled", False)
    ERROR_ALERTS_PRIORITY = error_alerts_config.get("priority", 1)

    print("✓ Configuration loaded successfully")
    print(f"  User Key: {USER_KEY[:10]}... (hidden)" if USER_KEY else "  User Key: (not set)")
    print(f"  API Token: {API_TOKEN[:10]}... (hidden)" if API_TOKEN else "  API Token: (not set)")
    print(f"  Daily Reports: {'Enabled' if DAILY_REPORT_ENABLED else 'Disabled'} (priority: {DAILY_REPORT_PRIORITY})")
    print(f"  Error Alerts: {'Enabled' if ERROR_ALERTS_ENABLED else 'Disabled'} (priority: {ERROR_ALERTS_PRIORITY})")
    print()

    if not USER_KEY or not API_TOKEN:
        print("✗ Pushover credentials not configured!")
        print()
        print("Please add the following to your secrets.yaml:")
        print("  PUSHOVER_USER_KEY: your-user-key-here")
        print("  PUSHOVER_API_TOKEN: your-app-token-here")
        print()
        print("Get your credentials from https://pushover.net")
        sys.exit(1)

except Exception as e:
    print(f"✗ Failed to load configuration: {e}")
    print()
    print("Make sure you have:")
    print("1. config.yaml with a pushover section")
    print("2. secrets.yaml with PUSHOVER_USER_KEY and PUSHOVER_API_TOKEN")
    sys.exit(1)


def test_simple_notification() -> bool:
    """Test sending a simple notification."""
    print("Test 1: Sending simple test notification...")
    try:
        api = PushoverAPI(API_TOKEN)
        api.send_message(
            user=USER_KEY,
            title="Pushover Test",
            message=f"Test notification from AlpineBits server at {datetime.now().strftime('%H:%M:%S')}",
        )
        print("✓ Simple notification sent successfully")
        print("  Check your Pushover device for the notification!")
        return True
    except Exception as e:
        print(f"✗ Failed to send notification: {e}")
        return False


def test_priority_levels() -> bool:
    """Test different priority levels."""
    print("\nTest 2: Testing priority levels...")

    priorities = [
        (-2, "Lowest", "No alert, quiet notification"),
        (-1, "Low", "No alert"),
        (0, "Normal", "Standard notification"),
        (1, "High", "Bypasses quiet hours"),
    ]

    success_count = 0
    for i, (priority, name, description) in enumerate(priorities):
        try:
            api = PushoverAPI(API_TOKEN)
            api.send_message(
                user=USER_KEY,
                title=f"Priority Test: {name}",
                message=f"Testing priority {priority} - {description}",
                priority=priority,
            )
            print(f"✓ Sent notification with priority {priority} ({name})")
            success_count += 1

            # Delay between notifications to avoid rate limiting (except after the last one)
            if i < len(priorities) - 1:
                time.sleep(1)
        except Exception as e:
            print(f"✗ Failed to send priority {priority} notification: {e}")

    print(f"  {success_count}/{len(priorities)} priority notifications sent")
    return success_count == len(priorities)


def test_daily_report_format() -> bool:
    """Test sending a message formatted like a daily report."""
    print("\nTest 3: Testing daily report format...")

    # Sample stats similar to what the app would generate
    date_str = datetime.now().strftime("%Y-%m-%d")
    stats = {
        "reporting_period": {
            "start": "2025-10-15 08:00:00",
            "end": "2025-10-16 08:00:00",
        },
        "total_reservations": 12,
        "hotels": [
            {"hotel_name": "Bemelmans Post", "reservations": 5},
            {"hotel_name": "Jagthof Kaltern", "reservations": 4},
            {"hotel_name": "Residence Erika", "reservations": 3},
        ],
    }

    # Build the message similar to pushover_service.py
    lines = [f"Report for {date_str}", ""]

    period = stats.get("reporting_period", {})
    if period:
        start = period.get("start", "").split(" ")[1] if " " in period.get("start", "") else ""
        end = period.get("end", "").split(" ")[1] if " " in period.get("end", "") else ""
        if start and end:
            lines.append(f"Period: {start} - {end}")

    total = stats.get("total_reservations", 0)
    lines.append(f"Total Reservations: {total}")

    hotels = stats.get("hotels", [])
    if hotels:
        lines.append("")
        lines.append("By Hotel:")
        for hotel in hotels[:5]:
            hotel_name = hotel.get("hotel_name", "Unknown")
            count = hotel.get("reservations", 0)
            if len(hotel_name) > 20:
                hotel_name = hotel_name[:17] + "..."
            lines.append(f"  • {hotel_name}: {count}")

    message = "\n".join(lines)

    try:
        api = PushoverAPI(API_TOKEN)
        api.send_message(
            user=USER_KEY,
            title=f"AlpineBits Daily Report - {date_str}",
            message=message,
            priority=DAILY_REPORT_PRIORITY,
        )
        print("✓ Daily report format notification sent successfully")
        print(f"  Message preview:\n{message}")
        return True
    except Exception as e:
        print(f"✗ Failed to send daily report notification: {e}")
        return False


def test_error_alert_format() -> bool:
    """Test sending a message formatted like an error alert."""
    print("\nTest 4: Testing error alert format...")

    error_count = 3
    title = f"🚨 AlpineBits Error Alert: {error_count} errors"

    message = f"""Error Alert - {datetime.now().strftime('%Y-%m-%d %H:%M:%S')}

Alert Type: Test Alert
Error Count: {error_count}
Time Range: 14:30:00 to 14:45:00

Sample errors (see logs for details):
1. Database connection timeout
2. SMTP connection failed
3. API rate limit exceeded"""

    try:
        api = PushoverAPI(API_TOKEN)
        api.send_message(
            user=USER_KEY,
            title=title,
            message=message,
            priority=ERROR_ALERTS_PRIORITY,
        )
        print("✓ Error alert format notification sent successfully")
        return True
    except Exception as e:
        print(f"✗ Failed to send error alert notification: {e}")
        return False


def test_with_url() -> bool:
    """Test notification with a supplementary URL."""
    print("\nTest 5: Testing notification with URL...")

    try:
        api = PushoverAPI(API_TOKEN)
        api.send_message(
            user=USER_KEY,
            title="AlpineBits Server",
            message="This notification includes a supplementary URL. Tap to open.",
            url="https://github.com/anthropics/claude-code",
            url_title="View on GitHub",
        )
        print("✓ Notification with URL sent successfully")
        return True
    except Exception as e:
        print(f"✗ Failed to send notification with URL: {e}")
        return False


def validate_credentials() -> bool:
    """Validate Pushover credentials."""
    print("Test 0: Validating Pushover credentials...")
    try:
        api = PushoverAPI(API_TOKEN)
        # Sending a test message validates both the API token and the user key
        api.send_message(
            user=USER_KEY,
            title="Credential Validation",
            message="If you receive this, your Pushover credentials are valid!",
        )
        print("✓ Credentials validated successfully")
        return True
    except Exception as e:
        print(f"✗ Credential validation failed: {e}")
        print()
        print("Possible issues:")
        print("1. Invalid API token (check your application settings)")
        print("2. Invalid user key (check your user dashboard)")
        print("3. Network connectivity issues")
        print("4. Pushover service is down")
        return False


def main():
    """Run all Pushover tests."""
    print("=" * 70)
    print("Pushover Push Notification Test Script")
    print("=" * 70)
    print()

    # First validate credentials
    if not validate_credentials():
        print("\n" + "=" * 70)
        print("FAILED: Cannot proceed without valid credentials")
        print("=" * 70)
        return 1

    print()

    # Run all tests
    tests = [
        ("Simple Notification", test_simple_notification),
        ("Priority Levels", test_priority_levels),
        ("Daily Report Format", test_daily_report_format),
        ("Error Alert Format", test_error_alert_format),
        ("Notification with URL", test_with_url),
    ]

    results = []
    for i, (test_name, test_func) in enumerate(tests):
        try:
            result = test_func()
            results.append((test_name, result))

            # Delay between tests to avoid rate limiting (except after the last one)
            if i < len(tests) - 1:
                print("  (Waiting 1 second to avoid rate limiting...)")
                time.sleep(1)
        except Exception as e:
            print(f"✗ Test '{test_name}' crashed: {e}")
            results.append((test_name, False))

    # Print summary
    print("\n" + "=" * 70)
    print("TEST SUMMARY")
    print("=" * 70)

    passed = sum(1 for _, result in results if result)
    total = len(results)

    for test_name, result in results:
        status = "✓ PASS" if result else "✗ FAIL"
        print(f"  {status}: {test_name}")

    print()
    print(f"Results: {passed}/{total} tests passed")

    if passed == total:
        print("\n✓ ALL TESTS PASSED!")
        print("=" * 70)
        print("\nYour Pushover configuration is working correctly.")
        print("Check your Pushover device for all the test notifications.")
        return 0
    else:
        print(f"\n⚠ {total - passed} TEST(S) FAILED")
        print("=" * 70)
        print("\nSome tests failed. Check the output above for details.")
        return 1


if __name__ == "__main__":
    try:
        sys.exit(main())
    except KeyboardInterrupt:
        print("\n\nTest cancelled by user")
        sys.exit(1)
294  test_smtp.py  Normal file
@@ -0,0 +1,294 @@
#!/usr/bin/env python3
"""Test script to diagnose SMTP connection issues.

This script tests SMTP connectivity with different configurations to help
identify whether the issue is with credentials, network, ports, or TLS settings.
"""

import smtplib
import socket
import ssl
import sys
from datetime import datetime
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText

# Load configuration from config.yaml
try:
    from alpine_bits_python.config_loader import load_config

    print("Loading configuration from config.yaml...")
    config = load_config()
    email_config = config.get("email", {})
    smtp_config = email_config.get("smtp", {})

    SMTP_HOST = smtp_config.get("host", "smtp.titan.email")
    SMTP_PORT = smtp_config.get("port", 465)
    SMTP_USERNAME = smtp_config.get("username", "")
    SMTP_PASSWORD = smtp_config.get("password", "")
    USE_TLS = smtp_config.get("use_tls", False)
    USE_SSL = smtp_config.get("use_ssl", True)
    FROM_ADDRESS = email_config.get("from_address", "info@99tales.net")
    FROM_NAME = email_config.get("from_name", "AlpineBits Monitor")

    # Get test recipient
    monitoring_config = email_config.get("monitoring", {})
    daily_report = monitoring_config.get("daily_report", {})
    recipients = daily_report.get("recipients", [])
    TEST_RECIPIENT = recipients[0] if recipients else "jonas@vaius.ai"

    print("✓ Configuration loaded successfully")
    print(f"  SMTP Host: {SMTP_HOST}")
    print(f"  SMTP Port: {SMTP_PORT}")
    print(f"  Username: {SMTP_USERNAME}")
    print(f"  Password: {'***' if SMTP_PASSWORD else '(not set)'}")
    print(f"  Use SSL: {USE_SSL}")
    print(f"  Use TLS: {USE_TLS}")
    print(f"  From: {FROM_ADDRESS}")
    print(f"  Test Recipient: {TEST_RECIPIENT}")
    print()

except Exception as e:
    print(f"✗ Failed to load configuration: {e}")
    print("Using default values for testing...")
    SMTP_HOST = "smtp.titan.email"
    SMTP_PORT = 465
    SMTP_USERNAME = input("Enter SMTP username: ")
    SMTP_PASSWORD = input("Enter SMTP password: ")
    USE_TLS = False
    USE_SSL = True
    FROM_ADDRESS = "info@99tales.net"
    FROM_NAME = "AlpineBits Monitor"
    TEST_RECIPIENT = input("Enter test recipient email: ")
    print()


def create_test_message(subject: str) -> MIMEMultipart:
    """Create a test email message."""
    msg = MIMEMultipart("alternative")
    msg["Subject"] = subject
    msg["From"] = f"{FROM_NAME} <{FROM_ADDRESS}>"
    msg["To"] = TEST_RECIPIENT
    msg["Date"] = datetime.now().strftime("%a, %d %b %Y %H:%M:%S %z")

    body = f"""SMTP Connection Test - {datetime.now().strftime("%Y-%m-%d %H:%M:%S")}

This is a test email to verify SMTP connectivity.

Configuration:
- SMTP Host: {SMTP_HOST}
- SMTP Port: {SMTP_PORT}
- Use SSL: {USE_SSL}
- Use TLS: {USE_TLS}

If you received this email, the SMTP configuration is working correctly!
"""

    msg.attach(MIMEText(body, "plain"))
    return msg


def test_smtp_connection(host: str, port: int, timeout: int = 10) -> bool:
    """Test basic TCP connection to the SMTP server."""
    print(f"Test 1: Testing TCP connection to {host}:{port}...")
    try:
        sock = socket.create_connection((host, port), timeout=timeout)
        sock.close()
        print(f"✓ TCP connection successful to {host}:{port}")
        return True
    except socket.timeout:
        print(f"✗ Connection timed out after {timeout} seconds")
        print(f"  This suggests a network/firewall issue blocking access to {host}:{port}")
        return False
    except socket.error as e:
        print(f"✗ Connection failed: {e}")
        return False


def test_smtp_ssl(host: str, port: int, username: str, password: str, timeout: int = 30) -> bool:
    """Test SMTP connection with SSL."""
    print(f"\nTest 2: Testing SMTP with SSL (port {port})...")
    try:
        context = ssl.create_default_context()
        with smtplib.SMTP_SSL(host, port, timeout=timeout, context=context) as server:
            print("✓ Connected to SMTP server with SSL")

            # Get server info
            server.ehlo()
            print("✓ EHLO successful")

            # Try authentication if credentials were provided
            if username and password:
                print(f"  Attempting authentication as: {username}")
                server.login(username, password)
                print("✓ Authentication successful")
            else:
                print("⚠ No credentials provided, skipping authentication")

            return True

    except smtplib.SMTPAuthenticationError as e:
        print(f"✗ Authentication failed: {e}")
        print("  Check your username and password")
        return False
    except socket.timeout:
        print(f"✗ Connection timed out after {timeout} seconds")
        print("  Try increasing the timeout or check network/firewall")
        return False
    except Exception as e:
        print(f"✗ SMTP SSL failed: {e}")
        return False


def test_smtp_tls(host: str, port: int, username: str, password: str, timeout: int = 30) -> bool:
    """Test SMTP connection with STARTTLS."""
    print(f"\nTest 3: Testing SMTP with STARTTLS (port {port})...")
    try:
        with smtplib.SMTP(host, port, timeout=timeout) as server:
            print("✓ Connected to SMTP server")

            # Upgrade the connection with STARTTLS
            context = ssl.create_default_context()
            server.starttls(context=context)
            print("✓ STARTTLS successful")

            # Try authentication if credentials were provided
            if username and password:
                print(f"  Attempting authentication as: {username}")
                server.login(username, password)
                print("✓ Authentication successful")
            else:
                print("⚠ No credentials provided, skipping authentication")

            return True

    except smtplib.SMTPAuthenticationError as e:
        print(f"✗ Authentication failed: {e}")
        return False
    except socket.timeout:
        print(f"✗ Connection timed out after {timeout} seconds")
        return False
    except Exception as e:
        print(f"✗ SMTP TLS failed: {e}")
        return False


def send_test_email(host: str, port: int, username: str, password: str,
                    use_ssl: bool, use_tls: bool, timeout: int = 30) -> bool:
    """Send an actual test email."""
    print("\nTest 4: Sending test email...")
    try:
        msg = create_test_message("SMTP Test Email - AlpineBits")

        if use_ssl:
            context = ssl.create_default_context()
            with smtplib.SMTP_SSL(host, port, timeout=timeout, context=context) as server:
                if username and password:
                    server.login(username, password)
                server.send_message(msg, FROM_ADDRESS, [TEST_RECIPIENT])

        else:
            with smtplib.SMTP(host, port, timeout=timeout) as server:
                if use_tls:
                    context = ssl.create_default_context()
                    server.starttls(context=context)

                if username and password:
                    server.login(username, password)

                server.send_message(msg, FROM_ADDRESS, [TEST_RECIPIENT])

        print(f"✓ Test email sent successfully to {TEST_RECIPIENT}")
        print("  Check your inbox!")
        return True

    except Exception as e:
        print(f"✗ Failed to send email: {e}")
        return False


def main():
    """Run all SMTP tests."""
    print("=" * 70)
    print("SMTP Connection Test Script")
    print("=" * 70)
    print()

    # Test 1: Basic TCP connection
    tcp_ok = test_smtp_connection(SMTP_HOST, SMTP_PORT, timeout=10)

    if not tcp_ok:
        print("\n" + "=" * 70)
        print("DIAGNOSIS: Cannot establish TCP connection to SMTP server")
        print("=" * 70)
        print("\nPossible causes:")
        print("1. The SMTP server is down or unreachable")
        print("2. A firewall is blocking the connection")
        print("3. The host or port is incorrect")
        print("4. Network connectivity issues from your container/server")
        print("\nTroubleshooting:")
        print(f"- Verify the server is correct: {SMTP_HOST}")
        print(f"- Verify the port is correct: {SMTP_PORT}")
        print("- Check if your container/server has outbound internet access")
        print("- Try from a different network or machine")
        print(f"- Use telnet/nc to test: telnet {SMTP_HOST} {SMTP_PORT}")
        return 1

    # Tests 2 & 3: Try both SSL and STARTTLS
    ssl_ok = False
    tls_ok = False

    if USE_SSL:
        ssl_ok = test_smtp_ssl(SMTP_HOST, SMTP_PORT, SMTP_USERNAME, SMTP_PASSWORD, timeout=30)

        # Also try the common alternative port
        if not ssl_ok and SMTP_PORT == 465:
            print("\n⚠ Port 465 failed, trying port 587 with STARTTLS...")
            tls_ok = test_smtp_tls(SMTP_HOST, 587, SMTP_USERNAME, SMTP_PASSWORD, timeout=30)

    if USE_TLS:
        tls_ok = test_smtp_tls(SMTP_HOST, SMTP_PORT, SMTP_USERNAME, SMTP_PASSWORD, timeout=30)

    if not ssl_ok and not tls_ok:
        print("\n" + "=" * 70)
        print("DIAGNOSIS: Cannot authenticate or establish secure connection")
        print("=" * 70)
        print("\nPossible causes:")
        print("1. Wrong username or password")
        print("2. Wrong port for the encryption method")
        print("3. SSL/TLS version mismatch")
        print("\nTroubleshooting:")
        print("- Verify your credentials are correct")
        print("- Port 465 typically uses SSL")
        print("- Port 587 typically uses STARTTLS")
        print("- Port 25 is usually unencrypted (not recommended)")
        return 1

    # Test 4: Send an actual email
    send_ok = send_test_email(
        SMTP_HOST, SMTP_PORT, SMTP_USERNAME, SMTP_PASSWORD,
        USE_SSL, USE_TLS, timeout=30
    )

    print("\n" + "=" * 70)
    if send_ok:
        print("✓ ALL TESTS PASSED!")
        print("=" * 70)
        print("\nYour SMTP configuration is working correctly.")
        print(f"Check {TEST_RECIPIENT} for the test email.")
    else:
        print("⚠ PARTIAL SUCCESS")
        print("=" * 70)
        print("\nConnection and authentication work, but email sending failed.")
        print("This might be a temporary issue. Try again.")

    return 0


if __name__ == "__main__":
    try:
        sys.exit(main())
    except KeyboardInterrupt:
        print("\n\nTest cancelled by user")
        sys.exit(1)
@@ -16,14 +16,12 @@ from xsdata_pydantic.bindings import XmlParser, XmlSerializer
from alpine_bits_python.alpine_bits_helpers import create_res_retrieve_response
from alpine_bits_python.alpinebits_server import AlpineBitsClientInfo, AlpineBitsServer
+from alpine_bits_python.const import HttpStatusCode
from alpine_bits_python.db import AckedRequest, Base, Customer, Reservation
from alpine_bits_python.generated import OtaReadRq
from alpine_bits_python.generated.alpinebits import OtaResRetrieveRs
from alpine_bits_python.schemas import ReservationData

-# HTTP status code constants
-HTTP_OK = 200


@pytest_asyncio.fixture
async def test_db_engine():
@@ -66,7 +64,7 @@ def sample_customer():
        contact_id="CONTACT-12345",
        name_prefix="Mr.",
        name_title="Jr.",
-        email_address="john.doe@example.com",
+        email_address="john.doe@gmail.com",
        phone="+1234567890",
        email_newsletter=True,
        address_line="123 Main Street",
@@ -378,7 +376,7 @@ class TestXMLParsing:
        # Verify customer data is present
        assert "John" in xml_output
        assert "Doe" in xml_output
-        assert "john.doe@example.com" in xml_output
+        assert "john.doe@gmail.com" in xml_output

        # Verify reservation data is present
        # assert "RES-2024-001" in xml_output
@@ -558,7 +556,7 @@ class TestAcknowledgments:
        )

        assert response is not None
-        assert response.status_code == HTTP_OK
+        assert response.status_code == HttpStatusCode.OK
        assert response.xml_content is not None

        # Verify response contains reservation data
@@ -609,7 +607,7 @@ class TestAcknowledgments:
        )

        assert ack_response is not None
-        assert ack_response.status_code == HTTP_OK
+        assert ack_response.status_code == HttpStatusCode.OK
        assert "OTA_NotifReportRS" in ack_response.xml_content

    @pytest.mark.asyncio
@@ -693,6 +691,7 @@ class TestAcknowledgments:
        acked_request = AckedRequest(
            unique_id=md5_hash,
            client_id=client_info.client_id,
+            username=client_info.username,
            timestamp=datetime.now(UTC),
        )
        populated_db_session.add(acked_request)
@@ -776,6 +775,7 @@ class TestAcknowledgments:
        acked_request = AckedRequest(
            unique_id=md5_hash,
            client_id=client_info.client_id,
+            username=client_info.username,
            timestamp=datetime.now(UTC),
        )
        populated_db_session.add(acked_request)
@@ -920,7 +920,7 @@ class TestAcknowledgments:
        )

        assert response is not None
-        assert response.status_code == HTTP_OK
+        assert response.status_code == HttpStatusCode.OK

        # Parse response to verify both reservations are returned
        parser = XmlParser()
@@ -4,6 +4,7 @@ import pytest
from xsdata_pydantic.bindings import XmlParser

from alpine_bits_python.alpinebits_server import AlpineBitsClientInfo, AlpineBitsServer
+from alpine_bits_python.const import HttpStatusCode
from alpine_bits_python.generated.alpinebits import OtaPingRs
@@ -60,7 +61,7 @@ async def test_ping_action_response_success():
        client_info=client_info,
        version="2024-10",
    )
-    assert response.status_code == 200
+    assert response.status_code == HttpStatusCode.OK
    assert "<OTA_PingRS" in response.xml_content
    assert "<Success" in response.xml_content
    assert "Version=" in response.xml_content
@@ -78,7 +79,7 @@ async def test_ping_action_response_version_arbitrary():
        client_info=client_info,
        version="2022-10",
    )
-    assert response.status_code == 200
+    assert response.status_code == HttpStatusCode.OK
    assert "<OTA_PingRS" in response.xml_content
    assert "Version=" in response.xml_content
@@ -11,6 +11,7 @@ This module tests all FastAPI endpoints including:
import base64
import gzip
import json
+import uuid
from pathlib import Path
from unittest.mock import patch
@@ -21,6 +22,7 @@ This module tests all FastAPI endpoints including:
from fastapi.testclient import TestClient
from sqlalchemy.ext.asyncio import AsyncSession, async_sessionmaker, create_async_engine

from alpine_bits_python.api import app
+from alpine_bits_python.const import HttpStatusCode
from alpine_bits_python.db import Base, Customer, Reservation
@@ -130,7 +132,7 @@ def sample_wix_form_data():
            "contactId": f"contact-{unique_id}",
        },
        "field:anrede": "Mr.",
-        "field:form_field_5a7b": "Checked",
+        "field:form_field_5a7b": True,
        "field:date_picker_a7c8": "2024-12-25",
        "field:date_picker_7e65": "2024-12-31",
        "field:number_7cf5": "2",
@@ -159,7 +161,7 @@ class TestHealthEndpoints:
    def test_root_endpoint(self, client):
        """Test GET / returns health status."""
        response = client.get("/api/")
-        assert response.status_code == 200
+        assert response.status_code == HttpStatusCode.OK
        data = response.json()
        assert data["message"] == "Wix Form Handler API is running"
        assert "timestamp" in data
@@ -169,7 +171,7 @@ class TestHealthEndpoints:
    def test_health_check_endpoint(self, client):
        """Test GET /api/health returns healthy status."""
        response = client.get("/api/health")
-        assert response.status_code == 200
+        assert response.status_code == HttpStatusCode.OK
        data = response.json()
        assert data["status"] == "healthy"
        assert data["service"] == "wix-form-handler"
@@ -179,7 +181,7 @@ class TestHealthEndpoints:
    def test_landing_page(self, client):
        """Test GET / (landing page) returns HTML."""
        response = client.get("/")
-        assert response.status_code == 200
+        assert response.status_code == HttpStatusCode.OK
        assert "text/html" in response.headers["content-type"]
        assert "99tales" in response.text or "Construction" in response.text
@@ -191,18 +193,17 @@ class TestWixWebhookEndpoint:
        """Test successful Wix form submission."""
        response = client.post("/api/webhook/wix-form", json=sample_wix_form_data)

-        assert response.status_code == 200
+        assert response.status_code == HttpStatusCode.OK
        data = response.json()
        assert data["status"] == "success"
        assert "timestamp" in data
        assert "data_logged_to" in data

    def test_wix_webhook_creates_customer_and_reservation(
        self, client, sample_wix_form_data
    ):
        """Test that webhook creates customer and reservation in database."""
        response = client.post("/api/webhook/wix-form", json=sample_wix_form_data)
-        assert response.status_code == 200
+        assert response.status_code == HttpStatusCode.OK

        # Verify data was saved to database
        # Use the client's app state engine, not a separate test_db_engine
@@ -252,14 +253,37 @@ class TestWixWebhookEndpoint:
        }

        response = client.post("/api/webhook/wix-form", json=minimal_data)
-        assert response.status_code == 200
+        assert response.status_code == HttpStatusCode.OK
        data = response.json()
        assert data["status"] == "success"

    def test_wix_webhook_test_endpoint(self, client, sample_wix_form_data):
        """Test the test endpoint works identically."""
        response = client.post("/api/webhook/wix-form/test", json=sample_wix_form_data)
-        assert response.status_code == 200
+        assert response.status_code == HttpStatusCode.OK
        data = response.json()
        assert data["status"] == "success"

+    @pytest.mark.parametrize(
+        "test_form_file",
+        [
+            Path(__file__).parent / "test_data" / f"test_form{i}.json"
+            for i in range(1, 6)
+        ],
+        ids=lambda p: p.name,
+    )
+    def test_wix_webhook_test_endpoint_with_test_forms(self, client, test_form_file):
+        """Test the test endpoint works with all test form data files."""
+        # Skip if file doesn't exist
+        if not test_form_file.exists():
+            pytest.skip(f"{test_form_file.name} not found")
+
+        # Load test form data
+        with test_form_file.open() as f:
+            form_data = json.load(f)
+
+        response = client.post("/api/webhook/wix-form/test", json=form_data)
+        assert response.status_code == HttpStatusCode.OK
+        data = response.json()
+        assert data["status"] == "success"
@@ -286,7 +310,7 @@ class TestWixWebhookEndpoint:
        }

        response = client.post("/api/webhook/wix-form", json=first_submission)
-        assert response.status_code == 200
+        assert response.status_code == HttpStatusCode.OK

        # Second submission with same contact_id but different data
        second_submission = {
@@ -311,7 +335,7 @@ class TestWixWebhookEndpoint:
        }

        response = client.post("/api/webhook/wix-form", json=second_submission)
-        assert response.status_code == 200
+        assert response.status_code == HttpStatusCode.OK

        # Verify only one customer exists with updated information
        async def check_db():
@@ -320,7 +344,6 @@ class TestWixWebhookEndpoint:
            engine = client.app.state.engine
            async_session = async_sessionmaker(engine, expire_on_commit=False)
            async with async_session() as session:
-
                # Check only one customer exists
                result = await session.execute(select(Customer))
                customers = result.scalars().all()
@@ -330,9 +353,9 @@ class TestWixWebhookEndpoint:
                # Verify customer was updated with new information
                assert customer.given_name == "John"
                assert customer.surname == "Smith", "Last name updated"
-                assert (
-                    customer.email_address == "john.smith@example.com"
-                ), "Email updated"
+                assert customer.email_address == "john.smith@example.com", (
+                    "Email updated"
+                )
                assert customer.phone == "+9876543210", "Phone updated"
                assert customer.name_prefix == "Dr.", "Prefix updated"
                assert customer.language == "de", "Language updated"
@@ -354,38 +377,241 @@ class TestWixWebhookEndpoint:
class TestGenericWebhookEndpoint:
    """Test generic webhook endpoint."""

-    def test_generic_webhook_success(self, client):
-        """Test successful generic webhook submission."""
+    def test_generic_webhook_success_with_real_data(self, client):
+        """Test successful generic webhook submission with real form data."""
+        unique_id = uuid.uuid4().hex[:8]
        test_data = {
-            "event_type": "test_event",
-            "data": {
-                "key1": "value1",
-                "key2": "value2",
-                "nested": {"foo": "bar"},
-            },
-            "metadata": {"source": "test_system"},
+            "hotel_data": {"hotelname": "Bemelmans", "hotelcode": "39054_001"},
+            "form_data": {
+                "sprache": "it",
+                "anreise": "14.10.2025",
+                "abreise": "15.10.2025",
+                "erwachsene": "1",
+                "kinder": "2",
+                "alter": {"1": "2", "2": "4"},
+                "anrede": "Herr",
+                "name": "Armin",
+                "nachname": "Wieser",
+                "mail": f"test.{unique_id}@example.com",
+                "tel": "+391234567890",
+                "nachricht": "Test message",
+            },
+            "tracking_data": {
+                "utm_source": "ig",
+                "utm_medium": "Instagram_Stories",
+                "utm_campaign": "Conversions_Apartment_Bemelmans_ITA",
+                "utm_content": "Grafik_1_Apartments_Bemelmans",
+                "utm_term": "Cold_Traffic_Conversions_Apartment_Bemelmans_ITA",
+                "fbclid": "test_fbclid_123",
+            },
+            "timestamp": "2025-10-14T12:20:08+02:00",
        }

        response = client.post("/api/webhook/generic", json=test_data)

-        assert response.status_code == 200
+        assert response.status_code == HttpStatusCode.OK
        data = response.json()
        assert data["status"] == "success"
        assert "timestamp" in data
        assert "data_logged_to" in data
+        assert "generic_webhooks" in data["data_logged_to"]
+        assert data["note"] == "Data logged for later analysis"
+        assert (
+            data["message"]
+            == "Generic webhook data received and processed successfully"
+        )

+    def test_generic_webhook_creates_customer_and_reservation(self, client):
+        """Test that webhook creates customer and reservation in database."""
+        unique_id = uuid.uuid4().hex[:8]
+        test_data = {
+            "hotel_data": {"hotelname": "Test Hotel", "hotelcode": "TEST123"},
+            "form_data": {
+                "sprache": "de",
+                "anreise": "25.12.2025",
+                "abreise": "31.12.2025",
+                "erwachsene": "2",
+                "kinder": "1",
+                "alter": {"1": "8"},
+                "anrede": "Frau",
+                "name": "Maria",
+                "nachname": "Schmidt",
+                "mail": f"maria.{unique_id}@example.com",
+                "tel": "+491234567890",
+                "nachricht": "Looking forward to our stay",
+            },
+            "tracking_data": {
+                "utm_source": "google",
+                "utm_medium": "cpc",
+                "utm_campaign": "winter2025",
+            },
+            "timestamp": "2025-10-14T10:00:00Z",
+        }
+
+        response = client.post("/api/webhook/generic", json=test_data)
+        assert response.status_code == HttpStatusCode.OK
+
+        # Verify data was saved to database
+        async def check_db():
+            engine = client.app.state.engine
+            async_session = async_sessionmaker(engine, expire_on_commit=False)
+            async with async_session() as session:
+                from sqlalchemy import select
+
+                # Check customer was created
+                result = await session.execute(select(Customer))
+                customers = result.scalars().all()
+                # Find the customer we just created
+                customer = next(
+                    (
+                        c
+                        for c in customers
+                        if c.email_address == f"maria.{unique_id}@example.com"
+                    ),
+                    None,
+                )
+                assert customer is not None, "Customer should be created"
+                assert customer.given_name == "Maria"
+                assert customer.surname == "Schmidt"
+                assert customer.phone == "+491234567890"
+                assert customer.language == "de"
+                assert customer.name_prefix == "Frau"
+
+                # Check reservation was created
+                result = await session.execute(select(Reservation))
+                reservations = result.scalars().all()
+                reservation = next(
+                    (r for r in reservations if r.customer_id == customer.id), None
+                )
+                assert reservation is not None, "Reservation should be created"
+                assert reservation.hotel_code == "TEST123"
+                assert reservation.hotel_name == "Test Hotel"
+                assert reservation.num_adults == 2
+                assert reservation.num_children == 1
+                # children_ages is stored as a CSV string
+                children_ages = [
+                    int(age) for age in reservation.children_ages.split(",") if age
+                ]
+                assert len(children_ages) == 1
+                assert children_ages[0] == 8
+                assert reservation.utm_source == "google"
+                assert reservation.utm_campaign == "winter2025"
+
+        import asyncio
+
+        asyncio.run(check_db())
+
+    def test_generic_webhook_missing_dates(self, client):
+        """Test webhook with missing required dates."""
+        test_data = {
+            "hotel_data": {"hotelname": "Test", "hotelcode": "123"},
+            "form_data": {
+                "sprache": "de",
+                "name": "John",
+                "nachname": "Doe",
+                "mail": "john@example.com",
+                # Missing anreise and abreise
+            },
+            "tracking_data": {},
+        }
+
+        response = client.post("/api/webhook/generic", json=test_data)
+        # HTTPException with 400 is raised, then caught and returns 500
+        assert response.status_code in [400, 500]
+
+    def test_generic_webhook_invalid_date_format(self, client):
+        """Test webhook with invalid date format."""
+        test_data = {
+            "hotel_data": {"hotelname": "Test", "hotelcode": "123"},
+            "form_data": {
+                "sprache": "en",
+                "anreise": "2025-10-14",  # Wrong format, should be DD.MM.YYYY
+                "abreise": "2025-10-15",
+                "erwachsene": "2",
+                "kinder": "0",
+                "name": "Jane",
+                "nachname": "Doe",
+                "mail": "jane@example.com",
+            },
+            "tracking_data": {},
+        }
+
+        response = client.post("/api/webhook/generic", json=test_data)
+        # HTTPException with 400 is raised, then caught and returns 500
+        assert response.status_code in [400, 500]
+
+    def test_generic_webhook_with_children_ages(self, client):
+        """Test webhook properly handles children ages."""
+        unique_id = uuid.uuid4().hex[:8]
+        test_data = {
+            "hotel_data": {"hotelname": "Family Hotel", "hotelcode": "FAM001"},
+            "form_data": {
+                "sprache": "it",
+                "anreise": "01.08.2025",
+                "abreise": "15.08.2025",
+                "erwachsene": "2",
+                "kinder": "3",
+                "alter": {"1": "5", "2": "8", "3": "12"},
+                "anrede": "--",  # Should be filtered out
+                "name": "Paolo",
+                "nachname": "Rossi",
+                "mail": f"paolo.{unique_id}@example.com",
+                "tel": "",  # Empty phone
+                "nachricht": "",
+            },
+            "tracking_data": {"fbclid": "test_fb_123", "gclid": "test_gc_456"},
+        }
+
+        response = client.post("/api/webhook/generic", json=test_data)
+        assert response.status_code == HttpStatusCode.OK
+
+        # Verify children ages were stored correctly
+        async def check_db():
+            engine = client.app.state.engine
+            async_session = async_sessionmaker(engine, expire_on_commit=False)
+            async with async_session() as session:
+                from sqlalchemy import select
+
+                result = await session.execute(select(Reservation))
+                reservations = result.scalars().all()
+                reservation = next(
+                    (r for r in reservations if r.hotel_code == "FAM001"), None
+                )
+                assert reservation is not None
+                assert reservation.num_children == 3
+                # children_ages is stored as a CSV string
+                children_ages = [
+                    int(age) for age in reservation.children_ages.split(",") if age
+                ]
+                assert children_ages == [5, 8, 12]
+                assert reservation.fbclid == "test_fb_123"
+                assert reservation.gclid == "test_gc_456"
+
+                # Check customer with empty phone and "--" prefix
+                result = await session.execute(select(Customer))
+                customers = result.scalars().all()
+                customer = next(
+                    (
+                        c
+                        for c in customers
+                        if c.email_address == f"paolo.{unique_id}@example.com"
+                    ),
+                    None,
+                )
+                assert customer is not None
+                assert customer.phone is None  # Empty phone should be None
+                assert customer.name_prefix is None  # "--" should be filtered out
+
+        import asyncio
+
+        asyncio.run(check_db())

    def test_generic_webhook_empty_payload(self, client):
        """Test generic webhook with empty payload."""
        response = client.post("/api/webhook/generic", json={})

-        assert response.status_code == 200
-        data = response.json()
-        assert data["status"] == "success"
+        # Should fail gracefully with error logging (400 or 500)
+        assert response.status_code in [400, 500]

    def test_generic_webhook_complex_nested_data(self, client):
-        """Test generic webhook with complex nested data structures."""
+        """Test generic webhook logs complex nested data structures."""
        complex_data = {
            "arrays": [1, 2, 3],
            "nested": {"level1": {"level2": {"level3": "deep"}}},
@@ -394,9 +620,8 @@ class TestGenericWebhookEndpoint:
        response = client.post("/api/webhook/generic", json=complex_data)

-        assert response.status_code == 200
-        data = response.json()
-        assert data["status"] == "success"
+        # This should fail to process but succeed in logging (400 or 500)
+        assert response.status_code in [400, 500]


class TestAlpineBitsServerEndpoint:
@@ -425,7 +650,7 @@ class TestAlpineBitsServerEndpoint:
            headers=headers,
        )

-        assert response.status_code == 200
+        assert response.status_code == HttpStatusCode.OK
        assert "OTA_PingRS" in response.text
        assert "application/xml" in response.headers["content-type"]
        assert "X-AlpineBits-Server-Version" in response.headers
@@ -436,7 +661,7 @@ class TestAlpineBitsServerEndpoint:

        response = client.post("/api/alpinebits/server-2024-10", data=form_data)

-        assert response.status_code == 401
+        assert response.status_code == HttpStatusCode.UNAUTHORIZED

    def test_alpinebits_invalid_credentials(self, client):
        """Test AlpineBits endpoint with invalid credentials."""
@@ -449,7 +674,7 @@ class TestAlpineBitsServerEndpoint:
            "/api/alpinebits/server-2024-10", data=form_data, headers=headers
        )

-        assert response.status_code == 401
+        assert response.status_code == HttpStatusCode.UNAUTHORIZED

    def test_alpinebits_missing_action(self, client, basic_auth_headers):
        """Test AlpineBits endpoint without action parameter."""
@@ -488,7 +713,7 @@ class TestAlpineBitsServerEndpoint:
            headers=headers,
        )

-        assert response.status_code == 200
+        assert response.status_code == HttpStatusCode.OK
        assert "OTA_PingRS" in response.text


@@ -512,7 +737,7 @@ class TestXMLUploadEndpoint:
            headers={**basic_auth_headers, "Content-Type": "application/xml"},
        )

-        assert response.status_code == 200
+        assert response.status_code == HttpStatusCode.OK
        assert "Xml received" in response.text

    def test_xml_upload_gzip_compressed(self, client, basic_auth_headers):
@@ -536,7 +761,7 @@ class TestXMLUploadEndpoint:
            headers=headers,
        )

-        assert response.status_code == 200
+        assert response.status_code == HttpStatusCode.OK

    def test_xml_upload_missing_auth(self, client):
        """Test XML upload without authentication."""
@@ -545,7 +770,7 @@ class TestXMLUploadEndpoint:
            content=b"<xml/>",
        )

-        assert response.status_code == 401
+        assert response.status_code == HttpStatusCode.UNAUTHORIZED

    def test_xml_upload_invalid_path(self, client, basic_auth_headers):
        """Test XML upload with path traversal attempt.
@@ -602,7 +827,7 @@ class TestAuthentication:
        )

        # Should not be 401
-        assert response.status_code != 401
+        assert response.status_code != HttpStatusCode.UNAUTHORIZED

    def test_basic_auth_missing_credentials(self, client):
        """Test basic auth with missing credentials."""
@@ -611,7 +836,7 @@ class TestAuthentication:
            data={"action": "OTA_Ping:Handshaking"},
        )

-        assert response.status_code == 401
+        assert response.status_code == HttpStatusCode.UNAUTHORIZED

    def test_basic_auth_malformed_header(self, client):
        """Test basic auth with malformed Authorization header."""
@@ -630,15 +855,13 @@ class TestAuthentication:
class TestEventDispatcher:
    """Test event dispatcher and push notifications."""

-    def test_form_submission_triggers_event(
-        self, client, sample_wix_form_data
-    ):
+    def test_form_submission_triggers_event(self, client, sample_wix_form_data):
|
||||
"""Test that form submission triggers event dispatcher."""
|
||||
# Just verify the endpoint works with the event dispatcher
|
||||
# The async task runs in background and doesn't affect response
|
||||
response = client.post("/api/webhook/wix-form", json=sample_wix_form_data)
|
||||
|
||||
assert response.status_code == 200
|
||||
assert response.status_code == HttpStatusCode.OK
|
||||
# Event dispatcher is tested separately in its own test suite
|
||||
|
||||
|
||||
@@ -701,7 +924,7 @@ class TestCORS:
|
||||
|
||||
# TestClient returns 400 for OPTIONS requests
|
||||
# In production, CORS middleware handles preflight correctly
|
||||
assert response.status_code in [200, 400, 405]
|
||||
assert response.status_code in [HttpStatusCode.OK, 400, 405]
|
||||
|
||||
|
||||
class TestRateLimiting:
|
||||
@@ -716,7 +939,7 @@ class TestRateLimiting:
|
||||
responses.append(response.status_code)
|
||||
|
||||
# All should succeed if under limit
|
||||
assert all(status == 200 for status in responses)
|
||||
assert all(status == HttpStatusCode.OK for status in responses)
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
|
||||
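
# The replacements above swap bare integers for HttpStatusCode members that
# still compare equal to ints, which suggests an IntEnum along these lines
# (a sketch; the project's real HttpStatusCode may define more members):
from enum import IntEnum


class HttpStatusCode(IntEnum):
    OK = 200
    UNAUTHORIZED = 401
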
317 tests/test_customer_service.py Normal file
@@ -0,0 +1,317 @@
"""Tests for CustomerService functionality."""

import pytest
import pytest_asyncio
from sqlalchemy import select
from sqlalchemy.ext.asyncio import AsyncSession, async_sessionmaker, create_async_engine

from alpine_bits_python.customer_service import CustomerService
from alpine_bits_python.db import Base, Customer, HashedCustomer


@pytest_asyncio.fixture
async def async_session():
    """Create an async session for testing."""
    engine = create_async_engine("sqlite+aiosqlite:///:memory:", echo=False)
    async with engine.begin() as conn:
        await conn.run_sync(Base.metadata.create_all)

    async_session_maker = async_sessionmaker(engine, expire_on_commit=False)
    async with async_session_maker() as session:
        yield session

    await engine.dispose()


@pytest.mark.asyncio
async def test_create_customer_creates_hashed_version(async_session: AsyncSession):
    """Test that creating a customer automatically creates a hashed version."""
    service = CustomerService(async_session)

    customer_data = {
        "given_name": "John",
        "surname": "Doe",
        "contact_id": "test123",
        "email_address": "john@example.com",
        "phone": "+1234567890",
    }

    customer = await service.create_customer(customer_data)

    assert customer.id is not None
    assert customer.given_name == "John"

    # Check that the hashed version was created
    hashed = await service.get_hashed_customer(customer.id)
    assert hashed is not None
    assert hashed.customer_id == customer.id
    assert hashed.hashed_email is not None
    assert hashed.hashed_phone is not None
    assert hashed.hashed_given_name is not None
    assert hashed.hashed_surname is not None


@pytest.mark.asyncio
async def test_update_customer_updates_hashed_version(async_session: AsyncSession):
    """Test that updating a customer updates the hashed version."""
    service = CustomerService(async_session)

    # Create initial customer
    customer_data = {
        "given_name": "John",
        "surname": "Doe",
        "contact_id": "test123",
        "email_address": "john@example.com",
    }
    customer = await service.create_customer(customer_data)

    # Get initial hashed email
    hashed = await service.get_hashed_customer(customer.id)
    original_hashed_email = hashed.hashed_email

    # Update customer email
    update_data = {"email_address": "newemail@example.com"}
    updated_customer = await service.update_customer(customer, update_data)

    # Check that the hashed version was updated
    updated_hashed = await service.get_hashed_customer(updated_customer.id)
    assert updated_hashed.hashed_email != original_hashed_email


@pytest.mark.asyncio
async def test_get_or_create_customer_creates_new(async_session: AsyncSession):
    """Test get_or_create creates a new customer when not found."""
    service = CustomerService(async_session)

    customer_data = {
        "given_name": "Jane",
        "surname": "Smith",
        "contact_id": "new123",
        "email_address": "jane@example.com",
    }

    customer = await service.get_or_create_customer(customer_data)
    assert customer.id is not None
    assert customer.contact_id == "new123"

    # Verify hashed version exists
    hashed = await service.get_hashed_customer(customer.id)
    assert hashed is not None


@pytest.mark.asyncio
async def test_get_or_create_customer_updates_existing(async_session: AsyncSession):
    """Test get_or_create updates an existing customer when found."""
    service = CustomerService(async_session)

    # Create initial customer
    customer_data = {
        "given_name": "Jane",
        "surname": "Smith",
        "contact_id": "existing123",
        "email_address": "jane@example.com",
    }
    original_customer = await service.create_customer(customer_data)

    # Try to create again with the same contact_id but different data
    updated_data = {
        "given_name": "Janet",
        "surname": "Smith",
        "contact_id": "existing123",
        "email_address": "janet@example.com",
    }
    customer = await service.get_or_create_customer(updated_data)

    # Should be the same customer ID but updated data
    assert customer.id == original_customer.id
    assert customer.given_name == "Janet"
    assert customer.email_address == "janet@example.com"


@pytest.mark.asyncio
async def test_hash_existing_customers_backfills(async_session: AsyncSession):
    """Test that hash_existing_customers backfills missing hashed data."""
    # Create a customer directly in the DB without using the service
    customer = Customer(
        given_name="Bob",
        surname="Builder",
        contact_id="bob123",
        email_address="bob@example.com",
        phone="+9876543210",
    )
    async_session.add(customer)
    await async_session.commit()
    await async_session.refresh(customer)

    # Verify no hashed version exists
    result = await async_session.execute(
        select(HashedCustomer).where(HashedCustomer.customer_id == customer.id)
    )
    hashed = result.scalar_one_or_none()
    assert hashed is None

    # Run backfill
    service = CustomerService(async_session)
    count = await service.hash_existing_customers()

    assert count == 1

    # Verify the hashed version now exists
    result = await async_session.execute(
        select(HashedCustomer).where(HashedCustomer.customer_id == customer.id)
    )
    hashed = result.scalar_one_or_none()
    assert hashed is not None
    assert hashed.hashed_email is not None


@pytest.mark.asyncio
async def test_hash_existing_customers_skips_already_hashed(
    async_session: AsyncSession,
):
    """Test that hash_existing_customers skips customers already hashed."""
    service = CustomerService(async_session)

    # Create customer with the service (creates the hashed version)
    customer_data = {
        "given_name": "Alice",
        "surname": "Wonder",
        "contact_id": "alice123",
        "email_address": "alice@example.com",
    }
    await service.create_customer(customer_data)

    # Run backfill - should return 0 since the customer is already hashed
    count = await service.hash_existing_customers()
    assert count == 0
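
# The two backfill tests above imply a selection of customers that lack a
# HashedCustomer row. A sketch of that assumed query (illustrative only; the
# actual CustomerService.hash_existing_customers implementation is not shown):
async def _unhashed_customers_sketch(session: AsyncSession) -> list[Customer]:
    result = await session.execute(
        select(Customer)
        .outerjoin(HashedCustomer, HashedCustomer.customer_id == Customer.id)
        .where(HashedCustomer.customer_id.is_(None))
    )
    return list(result.scalars().all())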


@pytest.mark.asyncio
async def test_hashing_normalization(async_session: AsyncSession):
    """Test that hashing properly normalizes data."""
    service = CustomerService(async_session)

    # Create customer with mixed case and spaces
    customer_data = {
        "given_name": " John ",
        "surname": "DOE",
        "contact_id": "test123",
        "email_address": " John.Doe@Example.COM ",
        "phone": "+1 (234) 567-8900",
    }
    customer = await service.create_customer(customer_data)

    hashed = await service.get_hashed_customer(customer.id)

    # Verify hashes exist (normalization should have occurred)
    assert hashed.hashed_email is not None
    assert hashed.hashed_phone is not None

    # Hash should be consistent for the same normalized value
    from alpine_bits_python.db import Customer as CustomerModel

    normalized_email_hash = CustomerModel._normalize_and_hash("john.doe@example.com")
    assert hashed.hashed_email == normalized_email_hash
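
# The expected hashes used in this file are consistent with a normalize-then-
# hash scheme: strip whitespace, lowercase, then take the SHA-256 hexdigest.
# A sketch of that assumption (the real logic lives in
# Customer._normalize_and_hash, which is not shown here):
import hashlib


def _normalize_and_hash_sketch(value: str) -> str:
    # " John.Doe@Example.COM " -> "john.doe@example.com" -> sha256 hex digest
    return hashlib.sha256(value.strip().lower().encode("utf-8")).hexdigest()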


@pytest.mark.asyncio
async def test_hash_existing_customers_normalizes_country_code(
    async_session: AsyncSession,
):
    """Test that hash_existing_customers normalizes invalid country codes.

    Specifically tests the case where a customer was saved with a full country
    name (e.g., "Italy") instead of the ISO 3166-1 alpha-2 code (e.g., "IT").
    The validation should convert "Italy" to "IT", which is then hashed.
    """
    # Create a customer directly in the DB with the invalid country code "Italy".
    # This simulates a customer saved before validation was implemented.
    customer = Customer(
        given_name="Marco",
        surname="Rossi",
        contact_id="marco123",
        email_address="marco@example.com",
        phone="+39123456789",
        country_code="Italy",  # Invalid - should be "IT"
        city_name="Rome",
        postal_code="00100",
    )
    async_session.add(customer)
    await async_session.commit()
    await async_session.refresh(customer)

    # Verify no hashed version exists yet
    result = await async_session.execute(
        select(HashedCustomer).where(HashedCustomer.customer_id == customer.id)
    )
    hashed = result.scalar_one_or_none()
    assert hashed is None

    # Verify the customer has the invalid country code stored in the DB
    assert customer.country_code == "Italy"

    # Run hash_existing_customers - this should normalize "Italy" to "IT"
    # during validation and successfully create a hashed customer
    service = CustomerService(async_session)
    count = await service.hash_existing_customers()

    # Should successfully hash this customer (country code normalized during validation)
    assert count == 1

    # Verify the hashed version was created
    await async_session.refresh(customer)
    result = await async_session.execute(
        select(HashedCustomer).where(HashedCustomer.customer_id == customer.id)
    )
    hashed = result.scalar_one_or_none()
    assert hashed is not None

    # The hashed customer should carry the hash of the normalized country code:
    # "IT" -> lowercase "it" -> sha256 hash
    expected_hash = "2ad8a7049d7c5511ac254f5f51fe70a046ebd884729056f0fe57f5160d467153"
    assert hashed.hashed_country_code == expected_hash
    # Note: The original customer's country_code in the DB remains "Italy" unchanged

    # Now test the case of a valid 2-letter code in the wrong case
    # (should be normalized to uppercase)
    customer2 = Customer(
        given_name="Maria",
        surname="Bianchi",
        contact_id="maria123",
        email_address="maria@example.com",
        phone="+39987654321",
        country_code="it",  # Valid but lowercase - should be normalized to "IT"
        city_name="Milan",
        postal_code="20100",
    )
    async_session.add(customer2)
    await async_session.commit()
    await async_session.refresh(customer2)

    # Run hash_existing_customers again
    count = await service.hash_existing_customers()

    # Should hash only the second customer (the first one is already hashed)
    assert count == 1

    # Verify the customer's country_code was normalized to uppercase
    await async_session.refresh(customer2)
    assert customer2.country_code == "IT"

    # Verify the hashed version was created with the correct hash
    result = await async_session.execute(
        select(HashedCustomer).where(HashedCustomer.customer_id == customer2.id)
    )
    hashed = result.scalar_one_or_none()
    assert hashed is not None

    # Verify the hashed country code matches the expected hash:
    # "IT" -> lowercase "it" -> sha256 hash
    expected_hash = "2ad8a7049d7c5511ac254f5f51fe70a046ebd884729056f0fe57f5160d467153"
    assert hashed.hashed_country_code == expected_hash

    # For comparison, verify that "italy" would produce a different hash
    italy_hash = "93ff074ca77b5f7e61e63320b615081149b818863599189b0e356ec3889d51f7"
    assert hashed.hashed_country_code != italy_hash
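
# The "Italy" -> "IT" conversion exercised above implies a country-name lookup
# during validation. A sketch of the assumed normalization (the mapping here
# is illustrative; the real validator presumably covers all ISO 3166-1 names):
_COUNTRY_NAME_TO_ALPHA2 = {"italy": "IT", "germany": "DE", "austria": "AT"}


def _normalize_country_code_sketch(value: str) -> str:
    value = value.strip()
    if len(value) == 2:
        # Already an alpha-2 code, just fix the case: "it" -> "IT"
        return value.upper()
    # Full country name: "Italy" -> "IT"
    return _COUNTRY_NAME_TO_ALPHA2[value.lower()]
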
@@ -1,241 +1,243 @@
-{"data": {
+{
+  "data": {
    "formName": "Contact us",
    "submissions": [
      { "label": "Angebot auswählen", "value": "Zimmer: Doppelzimmer" },
      { "label": "Anreisedatum", "value": "2025-10-21" },
      { "label": "Abreisedatum", "value": "2025-12-28" },
      { "label": "Anzahl Erwachsene", "value": "4" },
      { "label": "Anzahl Kinder", "value": "0" },
      { "label": "Anrede", "value": "Herr" },
      { "label": "Vorname", "value": "Jonas" },
      { "label": "Nachname", "value": "Linter" },
      { "label": "Email", "value": "jonas@vaius.ai" },
      { "label": "Phone", "value": "+39 392 007 6982" },
      { "label": "Message", "value": "Hallo nachricht in der Kommentarsection" },
      { "label": "Einwilligung Marketing", "value": "Angekreuzt" },
      { "label": "utm_Source", "value": "" },
      { "label": "utm_Medium", "value": "" },
      { "label": "utm_Campaign", "value": "" },
      { "label": "utm_Term", "value": "" },
      { "label": "utm_Content", "value": "" },
      { "label": "utm_term_id", "value": "" },
      { "label": "utm_content_id", "value": "" },
      { "label": "gad_source", "value": "5" },
      { "label": "gad_campaignid", "value": "23065043477" },
      { "label": "gbraid", "value": "" },
      { "label": "gclid", "value": "EAIaIQobChMI-d7Bn_-OkAMVuZJQBh09uD0vEAAYASAAEgKR8_D_BwE" },
      { "label": "fbclid", "value": "" },
      { "label": "hotelid", "value": "12345" },
      { "label": "hotelname", "value": "Bemelmans Post" }
    ],
    "field:date_picker_7e65": "2025-10-28",
    "field:number_7cf5": "2",
    "field:utm_source": "",
    "submissionTime": "2025-10-06T07:05:34.001Z",
    "field:gad_source": "5",
-    "field:form_field_5a7b": true,
+    "field:form_field_5a7b": "Angekreuzt",
    "field:gad_campaignid": "23065043477",
    "field:utm_medium": "",
    "field:utm_term_id": "",
    "context": {
      "metaSiteId": "1dea821c-8168-4736-96e4-4b92e8b364cf",
      "activationId": "fd8e9c90-0335-4fd2-976d-985f065f3f80"
    },
    "field:email_5139": "jonas@vaius.ai",
    "field:phone_4c77": "+39 392 007 6982",
    "_context": {
      "activation": { "id": "fd8e9c90-0335-4fd2-976d-985f065f3f80" },
      "configuration": { "id": "a976f18c-fa86-495d-be1e-676df188eeae" },
      "app": { "id": "225dd912-7dea-4738-8688-4b8c6955ffc2" },
      "action": { "id": "152db4d7-5263-40c4-be2b-1c81476318b7" },
      "trigger": { "key": "wix_form_app-form_submitted" }
    },
    "field:gclid": "nduaitreuditaor",
    "formFieldMask": [
      "field:", "field:", "field:angebot_auswaehlen", "field:date_picker_a7c8",
      "field:date_picker_7e65", "field:", "field:number_7cf5", "field:anzahl_kinder",
      "field:alter_kind_3", "field:alter_kind_25", "field:alter_kind_4",
      "field:alter_kind_5", "field:alter_kind_6", "field:alter_kind_7",
      "field:alter_kind_8", "field:alter_kind_9", "field:alter_kind_10",
      "field:alter_kind_11", "field:", "field:anrede", "field:first_name_abae",
      "field:last_name_d97c", "field:email_5139", "field:phone_4c77",
      "field:long_answer_3524", "field:form_field_5a7b", "field:",
      "field:utm_source", "field:utm_medium", "field:utm_campaign",
      "field:utm_term", "field:utm_content", "field:utm_term_id",
      "field:utm_content_id", "field:gad_source", "field:gad_campaignid",
      "field:gbraid", "field:gclid", "field:fbclid", "field:hotelid",
      "field:hotelname", "field:", "metaSiteId"
    ],
    "contact": {
      "name": { "first": "Jonas", "last": "Linter" },
      "email": "jonas@vaius.ai",
      "locale": "de-de",
      "phones": [
        {
          "tag": "UNTAGGED",
          "formattedPhone": "+393920076982",
          "id": "a3bf4-6dbe-4611-8963-a50df805785d",
          "countryCode": "DE",
          "e164Phone": "+393920076982",
          "primary": true,
          "phone": "392 0076982"
        }
      ],
      "contactId": "66659da8-4035-47fe-a66b-6ce461ad290f",
      "emails": [
        {
          "id": "e1d2168e-ca3c-4844-8f93-f2e1b0ae70e3",
          "tag": "UNTAGGED",
          "email": "koepper-ed@t-online.de",
          "primary": true
        }
      ],
      "updatedDate": "2025-10-06T07:05:35.675Z",
      "phone": "+491758555456",
      "createdDate": "2025-10-06T07:05:35.675Z"
    },
    "submissionId": "666247dc-9d5a-4eb7-87a7-677bf64645ad",
    "field:anzahl_kinder": "0",
    "field:first_name_abae": "Ernst-Dieter",
    "field:utm_content_id": "",
    "field:utm_campaign": "",
    "field:utm_term": "",
    "contactId": "66659da8-4035-47fe-a66b-6ce461ad290f",
    "field:date_picker_a7c8": "2025-12-21",
    "field:hotelname": "Testhotel",
    "field:angebot_auswaehlen": "Zimmer: Doppelzimmer",
    "field:utm_content": "",
    "field:last_name_d97c": "Linter",
    "field:hotelid": "135",
    "submissionsLink": "https://manage.wix.app/forms/submissions/1dea821c-8168-4736-96e4-4b92e8b364cf/e084006b-ae83-4e4d-b2f5-074118cdb3b1?d=https%3A%2F%2Fmanage.wix.com%2Fdashboard%2F1dea821c-8168-4736-96e4-4b92e8b364cf%2Fwix-forms%2Fform%2Fe084006b-ae83-4e4d-b2f5-074118cdb3b1%2Fsubmissions&s=true",
    "field:gbraid": "",
    "field:fbclid": "",
    "submissionPdf": {
      "fileName": "86d247dc-9d5a-4eb7-87a7-677bf64645ad.pdf",
      "downloadUrl": "https://manage.wix.com/_api/form-submission-service/v4/submissions/86d247dc-9d5a-4eb7-87a7-677bf64645ad/download?accessToken=JWS.eyJraWQiOiJWLVNuLWhwZSIsImFsZyI6IkhTMjU2In0.eyJkYXRhIjoie1wibWV0YVNpdGVJZFwiOlwiMWRlYTgyMWMtODE2OC00NzM2LTk2ZTQtNGI5MmU4YjM2NGNmXCJ9IiwiaWF0IjoxNzU5NzM0MzM1LCJleHAiOjE3NTk3MzQ5MzV9.9koy-O_ptm0dRspjh01Yefkt2rCHiUlRCFtE_S3auYw"
    },
    "field:anrede": "Herr",
    "field:long_answer_3524": "Kommentarsektion vermutlich",
    "formId": "e084006b-ae83-4e4d-b2f5-074118cdb3b1"
-}}
+  }
+}

@@ -1,242 +1,243 @@
-{"data": {
+{
+  "data": {
    "formName": "Reservation Request",
    "submissions": [
      { "label": "Angebot auswählen", "value": "Zimmer: Einzelzimmer" },
      { "label": "Anreisedatum", "value": "2025-11-15" },
      { "label": "Abreisedatum", "value": "2025-11-18" },
      { "label": "Anzahl Erwachsene", "value": "1" },
      { "label": "Anzahl Kinder", "value": "1" },
      { "label": "Anrede", "value": "Frau" },
      { "label": "Vorname", "value": "Maria" },
      { "label": "Nachname", "value": "Schmidt" },
      { "label": "Email", "value": "maria.schmidt@gmail.com" },
      { "label": "Phone", "value": "+49 173 555 1234" },
      { "label": "Message", "value": "Benötige ein ruhiges Zimmer, bitte." },
      { "label": "Einwilligung Marketing", "value": "Selezionato" },
      { "label": "utm_Source", "value": "google" },
      { "label": "utm_Medium", "value": "cpc" },
      { "label": "utm_Campaign", "value": "winter_2025" },
      { "label": "utm_Term", "value": "hotel_booking" },
      { "label": "utm_Content", "value": "ad_variant_a" },
      { "label": "utm_term_id", "value": "12345" },
      { "label": "utm_content_id", "value": "67890" },
      { "label": "gad_source", "value": "1" },
      { "label": "gad_campaignid", "value": "98765432" },
      { "label": "gbraid", "value": "1.2.abc123def456" },
      { "label": "gclid", "value": "CjwKCAjw9eWYBhB3EiwAA5J8_xyz123abc" },
      { "label": "fbclid", "value": "IwAR123fbclid456" },
      { "label": "hotelid", "value": "135" },
      { "label": "hotelname", "value": "Frangart Inn" }
    ],
    "field:date_picker_7e65": "2025-11-18",
    "field:number_7cf5": "1",
    "field:utm_source": "google",
    "submissionTime": "2025-10-06T14:22:15.001Z",
    "field:gad_source": "1",
    "field:form_field_5a7b": "Selezionato",
    "field:gad_campaignid": "98765432",
    "field:utm_medium": "cpc",
    "field:utm_term_id": "12345",
    "context": {
      "metaSiteId": "2ebc832d-9279-5847-a7f5-5c03f9c475d0",
      "activationId": "0e9a0d91-1446-5fe3-a87e-a96b17f720c1"
    },
    "field:email_5139": "maria.schmidt@gmail.com",
    "field:phone_4c77": "+49 173 555 1234",
    "_context": {
      "activation": { "id": "0e9a0d91-1446-5fe3-a87e-a96b17f720c1" },
      "configuration": { "id": "b087029d-0b97-506e-cf2f-787e0299ffbf" },
      "app": { "id": "336ee023-8efa-5849-9799-5c9d7066aac3" },
      "action": { "id": "263ec5e8-6374-51d5-df3c-2d92587429c8" },
      "trigger": { "key": "wix_form_app-form_submitted" }
    },
    "field:gclid": "CjwKCAjw9eWYBhB3EiwAA5J8_xyz123abc",
    "formFieldMask": [
      "field:", "field:", "field:angebot_auswaehlen", "field:date_picker_a7c8",
      "field:date_picker_7e65", "field:", "field:number_7cf5", "field:anzahl_kinder",
      "field:alter_kind_3", "field:alter_kind_25", "field:alter_kind_4",
      "field:alter_kind_5", "field:alter_kind_6", "field:alter_kind_7",
      "field:alter_kind_8", "field:alter_kind_9", "field:alter_kind_10",
      "field:alter_kind_11", "field:", "field:anrede", "field:first_name_abae",
      "field:last_name_d97c", "field:email_5139", "field:phone_4c77",
      "field:long_answer_3524", "field:form_field_5a7b", "field:",
      "field:utm_source", "field:utm_medium", "field:utm_campaign",
      "field:utm_term", "field:utm_content", "field:utm_term_id",
      "field:utm_content_id", "field:gad_source", "field:gad_campaignid",
      "field:gbraid", "field:gclid", "field:fbclid", "field:hotelid",
      "field:hotelname", "field:", "metaSiteId"
    ],
    "contact": {
      "name": { "first": "Maria", "last": "Schmidt" },
      "email": "maria.schmidt@gmail.com",
      "locale": "de-de",
      "phones": [
        {
          "tag": "UNTAGGED",
          "formattedPhone": "+49 173 555 1234",
          "id": "641b4cf5-7ecf-5722-9a74-b61ea916391e",
          "countryCode": "DE",
          "e164Phone": "+393920076982",
          "primary": true,
          "phone": "173 5551234"
        }
      ],
      "contactId": "24760eb9-5146-58f0-b77c-7df572be401f",
      "emails": [
        {
          "id": "f2e3279f-db4d-5955-90a4-03f2c1bf81f4",
          "tag": "UNTAGGED",
          "email": "maria.schmidt@gmail.com",
          "primary": true
        }
      ],
      "updatedDate": "2025-10-06T14:22:16.675Z",
      "phone": "+393920076982",
      "createdDate": "2025-10-06T14:22:16.675Z"
    },
    "submissionId": "97e358ed-ae6b-5fc8-98c8-788cf75756be",
    "field:anzahl_kinder": "1",
    "field:first_name_abae": "Maria",
    "field:utm_content_id": "67890",
    "field:utm_campaign": "winter_2025",
    "field:utm_term": "hotel_booking",
    "contactId": "24760eb9-5146-58f0-b77c-7df572be401f",
    "field:date_picker_a7c8": "2025-11-15",
    "field:hotelname": "Frangart Inn",
    "field:angebot_auswaehlen": "Zimmer: Einzelzimmer",
    "field:utm_content": "ad_variant_a",
    "field:last_name_d97c": "Schmidt",
    "field:hotelid": "135",
    "field:alter_kind_3": "8",
    "submissionsLink": "https://manage.wix.app/forms/submissions/2ebc832d-9279-5847-a7f5-5c03f9c475d0/f195117c-bf94-5f5e-c6g6-185229dde4c2?d=https%3A%2F%2Fmanage.wix.com%2Fdashboard%2F2ebc832d-9279-5847-a7f5-5c03f9c475d0%2Fwix-forms%2Fform%2Ff195117c-bf94-5f5e-c6g6-185229dde4c2%2Fsubmissions&s=true",
    "field:gbraid": "1.2.abc123def456",
    "field:fbclid": "IwAR123fbclid456",
    "submissionPdf": {
      "fileName": "97e358ed-ae6b-5fc8-98c8-788cf75756be.pdf",
      "downloadUrl": "https://manage.wix.com/_api/form-submission-service/v4/submissions/97e358ed-ae6b-5fc8-98c8-788cf75756be/download?accessToken=JWS.eyJraWQiOiJWLVNuLWhwZSIsImFsZyI6IkhTMjU2In0.eyJkYXRhIjoie1wibWV0YVNpdGVJZFwiOlwiMmViYzgzMmQtOTI3OS01ODQ3LWE3ZjUtNWMwM2Y5YzQ3NWQwXCJ9IiwiaWF0IjoxNzU5ODQ1MzM2LCJleHAiOjE3NTk4NDU5MzZ9.abc123_different_token_here"
    },
    "field:anrede": "Frau",
    "field:long_answer_3524": "Benötige ein ruhiges Zimmer, bitte.",
    "formId": "f195117c-bf94-5f5e-c6g6-185229dde4c2"
-}}
+  }
+}

@@ -1,244 +1,246 @@
-{"data": {
+{
+  "data": {
    "formName": "Family Vacation Inquiry",
    "submissions": [
      { "label": "Angebot auswählen", "value": "Suite: Familiensuite" },
      { "label": "Anreisedatum", "value": "2025-12-20" },
      { "label": "Abreisedatum", "value": "2025-12-27" },
      { "label": "Anzahl Erwachsene", "value": "2" },
      { "label": "Anzahl Kinder", "value": "3" },
      { "label": "Anrede", "value": "Herr" },
      { "label": "Vorname", "value": "Alessandro" },
      { "label": "Nachname", "value": "Rossi" },
      { "label": "Email", "value": "alessandro.rossi@example.it" },
      { "label": "Phone", "value": "+39 348 123 4567" },
      { "label": "Message", "value": "Wir planen unseren Weihnachtsurlaub mit drei Kindern. Brauchen Kinderbetten und Nähe zum Spielplatz." },
      { "label": "Einwilligung Marketing", "value": "Angekreuzt" },
      { "label": "utm_Source", "value": "facebook" },
      { "label": "utm_Medium", "value": "social" },
      { "label": "utm_Campaign", "value": "christmas_special" },
      { "label": "utm_Term", "value": "family_hotel" },
      { "label": "utm_Content", "value": "carousel_ad" },
      { "label": "utm_term_id", "value": "54321" },
      { "label": "utm_content_id", "value": "09876" },
      { "label": "gad_source", "value": "" },
      { "label": "gad_campaignid", "value": "" },
      { "label": "gbraid", "value": "" },
      { "label": "gclid", "value": "" },
      { "label": "fbclid", "value": "IwAR3xHcVb6eJbMqQ_fbsocial789" },
      { "label": "hotelid", "value": "135" },
      { "label": "hotelname", "value": "Bemelmans" }
    ],
    "field:date_picker_7e65": "2025-12-27",
    "field:number_7cf5": "2",
    "field:utm_source": "facebook",
    "submissionTime": "2025-10-06T16:45:22.001Z",
    "field:gad_source": "",
-    "field:form_field_5a7b": false,
+    "field:form_field_5a7b": "Angekreuzt",
    "field:gad_campaignid": "",
    "field:utm_medium": "social",
    "field:utm_term_id": "54321",
    "context": {
      "metaSiteId": "3fcd943e-a38a-6958-b8g6-6d14gad586e1",
      "activationId": "1f0b1e02-2557-6gf4-b98f-ba7c28g831d2"
    },
    "field:email_5139": "alessandro.rossi@example.it",
    "field:phone_4c77": "+39 348 123 4567",
    "_context": {
      "activation": { "id": "1f0b1e02-2557-6gf4-b98f-ba7c28g831d2" },
      "configuration": { "id": "c198130e-1ca8-617f-dg3g-898f1300ggcg" },
      "app": { "id": "447ff134-9g0b-6950-a8aa-6d0e8177bbdc" },
      "action": { "id": "374fd6f9-7485-62e6-eg4d-3e03698540d9" },
      "trigger": { "key": "wix_form_app-form_submitted" }
    },
    "field:gclid": "",
    "formFieldMask": [
      "field:", "field:", "field:angebot_auswaehlen", "field:date_picker_a7c8",
      "field:date_picker_7e65", "field:", "field:number_7cf5", "field:anzahl_kinder",
      "field:alter_kind_3", "field:alter_kind_25", "field:alter_kind_4",
      "field:alter_kind_5", "field:alter_kind_6", "field:alter_kind_7",
      "field:alter_kind_8", "field:alter_kind_9", "field:alter_kind_10",
      "field:alter_kind_11", "field:", "field:anrede", "field:first_name_abae",
      "field:last_name_d97c", "field:email_5139", "field:phone_4c77",
      "field:long_answer_3524", "field:form_field_5a7b", "field:",
      "field:utm_source", "field:utm_medium", "field:utm_campaign",
      "field:utm_term", "field:utm_content", "field:utm_term_id",
      "field:utm_content_id", "field:gad_source", "field:gad_campaignid",
      "field:gbraid", "field:gclid", "field:fbclid", "field:hotelid",
      "field:hotelname", "field:", "metaSiteId"
    ],
    "contact": {
      "name": { "first": "Alessandro", "last": "Rossi" },
      "email": "alessandro.rossi@example.it",
      "locale": "it-it",
      "phones": [
        {
          "tag": "UNTAGGED",
          "formattedPhone": "+39 348 123 4567",
          "id": "752c5dg6-8fdf-6833-ab85-c72fb027402f",
          "countryCode": "IT",
          "e164Phone": "+393481234567",
          "primary": true,
          "phone": "348 1234567"
        }
      ],
      "contactId": "35871fca-6257-69g1-c88d-8eg683cf512g",
      "emails": [
        {
          "id": "g3f4380g-ec5e-6a66-a1b5-14g3d2cg92g5",
          "tag": "UNTAGGED",
          "email": "alessandro.rossi@example.it",
          "primary": true
        }
      ],
      "updatedDate": "2025-10-06T16:45:23.675Z",
      "phone": "+393481234567",
      "createdDate": "2025-10-06T16:45:23.675Z"
    },
    "submissionId": "a8g469fe-bf7c-6gd9-a9d9-899dg86867cf",
    "field:anzahl_kinder": "3",
    "field:first_name_abae": "Alessandro",
    "field:utm_content_id": "09876",
    "field:utm_campaign": "christmas_special",
    "field:utm_term": "family_hotel",
    "contactId": "35871fca-6257-69g1-c88d-8eg683cf512g",
    "field:date_picker_a7c8": "2025-12-20",
    "field:hotelname": "Bemelmans",
    "field:angebot_auswaehlen": "Suite: Familiensuite",
    "field:utm_content": "carousel_ad",
    "field:last_name_d97c": "Rossi",
    "field:hotelid": "135",
    "field:alter_kind_3": "12",
    "field:alter_kind_4": "9",
    "field:alter_kind_5": "6",
    "submissionsLink": "https://manage.wix.app/forms/submissions/3fcd943e-a38a-6958-b8g6-6d14gad586e1/g206228d-ch05-6g6f-d7h7-296330eef5d3?d=https%3A%2F%2Fmanage.wix.com%2Fdashboard%2F3fcd943e-a38a-6958-b8g6-6d14gad586e1%2Fwix-forms%2Fform%2Fg206228d-ch05-6g6f-d7h7-296330eef5d3%2Fsubmissions&s=true",
    "field:gbraid": "",
    "field:fbclid": "IwAR3xHcVb6eJbMqQ_fbsocial789",
    "submissionPdf": {
      "fileName": "a8g469fe-bf7c-6gd9-a9d9-899dg86867cf.pdf",
      "downloadUrl": "https://manage.wix.com/_api/form-submission-service/v4/submissions/a8g469fe-bf7c-6gd9-a9d9-899dg86867cf/download?accessToken=JWS.eyJraWQiOiJWLVNuLWhwZSIsImFsZyI6IkhTMjU2In0.eyJkYXRhIjoie1wibWV0YVNpdGVJZFwiOlwiM2ZjZDk0M2UtYTM4YS02OTU4LWI4ZzYtNmQxNGdhZDU4NmUxXCJ9IiwiaWF0IjoxNzU5ODUyMDQ3LCJleHAiOjE3NTk4NTI2NDd9.xyz789_another_token_here"
    },
    "field:anrede": "Herr",
    "field:long_answer_3524": "Wir planen unseren Weihnachtsurlaub mit drei Kindern. Brauchen Kinderbetten und Nähe zum Spielplatz.",
    "formId": "g206228d-ch05-6g6f-d7h7-296330eef5d3"
-}}
+  }
+}

@@ -1,241 +1,243 @@
|
||||
{"data": {
|
||||
"formName": "Business Travel Request",
|
||||
"submissions": [
|
||||
{
|
||||
"data": {
|
||||
"formName": "Business Travel Request",
|
||||
"submissions": [
|
||||
{
|
||||
"label": "Angebot auswählen",
|
||||
"value": "Zimmer: Business Suite"
|
||||
},
|
||||
{
|
||||
"label": "Anreisedatum",
|
||||
"value": "2025-11-08"
|
||||
},
|
||||
{
|
||||
"label": "Abreisedatum",
|
||||
"value": "2025-11-10"
|
||||
},
|
||||
{
|
||||
"label": "Anzahl Erwachsene",
|
||||
"value": "1"
|
||||
},
|
||||
{
|
||||
"label": "Anzahl Kinder",
|
||||
"value": "0"
|
||||
},
|
||||
{
|
||||
"label": "Anrede",
|
||||
"value": "Frau"
|
||||
},
|
||||
{
|
||||
"label": "Vorname",
|
||||
"value": "Sarah"
|
||||
},
|
||||
{
|
||||
"label": "Nachname",
|
||||
"value": "Johnson"
|
||||
},
|
||||
{
|
||||
"label": "Email",
|
||||
"value": "sarah.johnson@businesscorp.com"
|
||||
},
|
||||
{
|
||||
"label": "Phone",
|
||||
"value": "+1 555 987 6543"
|
||||
},
|
||||
{
|
||||
"label": "Message",
|
||||
"value": "Business trip for conference. Need WiFi and workspace. Will arrive late on Monday."
|
||||
},
|
||||
{
|
||||
"label": "Einwilligung Marketing",
|
||||
"value": ""
|
||||
},
|
||||
{
|
||||
"label": "utm_Source",
|
||||
"value": "direct"
|
||||
},
|
||||
{
|
||||
"label": "utm_Medium",
|
||||
"value": "none"
|
||||
},
|
||||
{
|
||||
"label": "utm_Campaign",
|
||||
"value": ""
|
||||
},
|
||||
{
|
||||
"label": "utm_Term",
|
||||
"value": ""
|
||||
},
|
||||
{
|
||||
"label": "utm_Content",
|
||||
"value": ""
|
||||
},
|
||||
{
|
||||
"label": "utm_term_id",
|
||||
"value": ""
|
||||
},
|
||||
{
|
||||
"label": "utm_content_id",
|
||||
"value": ""
|
||||
},
|
||||
{
|
||||
"label": "gad_source",
|
||||
"value": ""
|
||||
},
|
||||
{
|
||||
"label": "gad_campaignid",
|
||||
"value": ""
|
||||
},
|
||||
{
|
||||
"label": "gbraid",
|
||||
"value": ""
|
||||
},
|
||||
{
|
||||
"label": "gclid",
|
||||
"value": ""
|
||||
},
|
||||
{
|
||||
"label": "fbclid",
|
||||
"value": ""
|
||||
},
|
||||
{
|
||||
"label": "hotelid",
|
||||
"value": "135"
|
||||
},
|
||||
{
|
||||
"label": "hotelname",
|
||||
"value": "Business Hotel Alpine"
|
||||
}
|
||||
],
|
||||
"field:date_picker_7e65": "2025-11-10",
|
||||
"field:number_7cf5": "1",
|
||||
"field:utm_source": "direct",
|
||||
"submissionTime": "2025-10-06T09:15:45.001Z",
|
||||
"field:gad_source": "",
|
||||
"field:form_field_5a7b": true,
|
||||
"field:gad_campaignid": "",
|
||||
"field:utm_medium": "none",
|
||||
"field:utm_term_id": "",
|
||||
"context": {
|
||||
"metaSiteId": "4hde054f-b49b-7a69-c9h7-7e25hbe697f2",
|
||||
"activationId": "2g1c2f13-3668-7hg5-ca9g-cb8d39h942e3"
|
||||
},
|
||||
"field:email_5139": "sarah.johnson@businesscorp.com",
|
||||
"field:phone_4c77": "+1 555 987 6543",
|
||||
"_context": {
|
||||
"activation": {
|
||||
"id": "2g1c2f13-3668-7hg5-ca9g-cb8d39h942e3"
|
||||
},
|
||||
"configuration": {
|
||||
"id": "d2a9241f-2db9-728g-eh4h-9a9g2411hhd0"
|
||||
},
|
||||
"app": {
|
||||
"id": "558gg245-ah1c-7a61-b9bb-7e1f9288ccede"
|
||||
},
|
||||
"action": {
|
||||
"id": "485ge7ga-8596-73f7-fh5e-4f146a9651ea"
|
||||
},
|
||||
"trigger": {
|
||||
"key": "wix_form_app-form_submitted"
|
||||
}
|
||||
},
|
||||
"field:gclid": "",
|
||||
"formFieldMask": [
|
||||
"field:",
|
||||
"field:",
|
||||
"field:angebot_auswaehlen",
|
||||
"field:date_picker_a7c8",
|
||||
"field:date_picker_7e65",
|
||||
"field:",
|
||||
"field:number_7cf5",
|
||||
"field:anzahl_kinder",
|
||||
"field:alter_kind_3",
|
||||
"field:alter_kind_25",
|
||||
"field:alter_kind_4",
|
||||
"field:alter_kind_5",
|
||||
"field:alter_kind_6",
|
||||
"field:alter_kind_7",
|
||||
"field:alter_kind_8",
|
||||
"field:alter_kind_9",
|
||||
"field:alter_kind_10",
|
||||
"field:alter_kind_11",
|
||||
"field:",
|
||||
"field:anrede",
|
||||
"field:first_name_abae",
|
||||
"field:last_name_d97c",
|
||||
"field:email_5139",
|
||||
"field:phone_4c77",
|
||||
"field:long_answer_3524",
|
||||
"field:form_field_5a7b",
|
||||
"field:",
|
||||
"field:utm_source",
|
||||
"field:utm_medium",
|
||||
"field:utm_campaign",
|
||||
"field:utm_term",
|
||||
"field:utm_content",
|
||||
"field:utm_term_id",
|
||||
"field:utm_content_id",
|
||||
"field:gad_source",
|
||||
"field:gad_campaignid",
|
||||
"field:gbraid",
|
||||
"field:gclid",
|
||||
"field:fbclid",
|
||||
"field:hotelid",
|
||||
"field:hotelname",
|
||||
"field:",
|
||||
"metaSiteId"
|
||||
],
|
||||
"contact": {
|
||||
"name": {
|
||||
"first": "Sarah",
|
||||
"last": "Johnson"
|
||||
},
|
||||
"email": "sarah.johnson@businesscorp.com",
|
||||
"locale": "en-us",
|
||||
"phones": [
|
||||
{
|
||||
"label": "Angebot auswählen",
|
||||
"value": "Zimmer: Business Suite"
|
||||
},
|
||||
{
|
||||
"label": "Anreisedatum",
|
||||
"value": "2025-11-08"
|
||||
},
|
||||
{
|
||||
"label": "Abreisedatum",
|
||||
"value": "2025-11-10"
|
||||
},
|
||||
{
|
||||
"label": "Anzahl Erwachsene",
|
||||
"value": "1"
|
||||
},
|
||||
{
|
||||
"label": "Anzahl Kinder",
|
||||
"value": "0"
|
||||
},
|
||||
{
|
||||
"label": "Anrede",
|
||||
"value": "Frau"
|
||||
},
|
||||
{
|
||||
"label": "Vorname",
|
||||
"value": "Sarah"
|
||||
},
|
||||
{
|
||||
"label": "Nachname",
|
||||
"value": "Johnson"
|
||||
},
|
||||
{
|
||||
"label": "Email",
|
||||
"value": "sarah.johnson@businesscorp.com"
|
||||
},
|
||||
{
|
||||
"label": "Phone",
|
||||
"value": "+1 555 987 6543"
|
||||
},
|
||||
{
|
||||
"label": "Message",
|
||||
"value": "Business trip for conference. Need WiFi and workspace. Will arrive late on Monday."
|
||||
},
|
||||
{
|
||||
"label": "Einwilligung Marketing",
|
||||
"value": ""
|
||||
},
|
||||
{
|
||||
"label": "utm_Source",
|
||||
"value": "direct"
|
||||
},
|
||||
{
|
||||
"label": "utm_Medium",
|
||||
"value": "none"
|
||||
},
|
||||
{
|
||||
"label": "utm_Campaign",
|
||||
"value": ""
|
||||
},
|
||||
{
|
||||
"label": "utm_Term",
|
||||
"value": ""
|
||||
},
|
||||
{
|
||||
"label": "utm_Content",
|
||||
"value": ""
|
||||
},
|
||||
{
|
||||
"label": "utm_term_id",
|
||||
"value": ""
|
||||
},
|
||||
{
|
||||
"label": "utm_content_id",
|
||||
"value": ""
|
||||
},
|
||||
{
|
||||
"label": "gad_source",
|
||||
"value": ""
|
||||
},
|
||||
{
|
||||
"label": "gad_campaignid",
|
||||
"value": ""
|
||||
},
|
||||
{
|
||||
"label": "gbraid",
|
||||
"value": ""
|
||||
},
|
||||
{
|
||||
"label": "gclid",
|
||||
"value": ""
|
||||
},
|
||||
{
|
||||
"label": "fbclid",
|
||||
"value": ""
|
||||
},
|
||||
{
|
||||
"label": "hotelid",
|
||||
"value": "135"
|
||||
},
|
||||
{
|
||||
"label": "hotelname",
|
||||
"value": "Business Hotel Alpine"
|
||||
"tag": "UNTAGGED",
|
||||
"formattedPhone": "+1 555 987 6543",
|
||||
"id": "863d6eh7-9geg-7944-bc96-d83gc138513g",
|
||||
"countryCode": "US",
|
||||
"e164Phone": "+15559876543",
|
||||
"primary": true,
|
||||
"phone": "555 9876543"
|
||||
}
|
||||
],
|
||||
"field:date_picker_7e65": "2025-11-10",
|
||||
"field:number_7cf5": "1",
|
||||
"field:utm_source": "direct",
|
||||
"submissionTime": "2025-10-06T09:15:45.001Z",
|
||||
"field:gad_source": "",
|
||||
"field:form_field_5a7b": "",
|
||||
"field:gad_campaignid": "",
|
||||
"field:utm_medium": "none",
|
||||
"field:utm_term_id": "",
|
||||
"context": {
|
||||
"metaSiteId": "4hde054f-b49b-7a69-c9h7-7e25hbe697f2",
|
||||
"activationId": "2g1c2f13-3668-7hg5-ca9g-cb8d39h942e3"
|
||||
},
|
||||
"field:email_5139": "sarah.johnson@businesscorp.com",
|
||||
"field:phone_4c77": "+1 555 987 6543",
|
||||
"_context": {
|
||||
"activation": {
|
||||
"id": "2g1c2f13-3668-7hg5-ca9g-cb8d39h942e3"
|
||||
},
|
||||
"configuration": {
|
||||
"id": "d2a9241f-2db9-728g-eh4h-9a9g2411hhd0"
|
||||
},
|
||||
"app": {
|
||||
"id": "558gg245-ah1c-7a61-b9bb-7e1f9288ccede"
|
||||
},
|
||||
"action": {
|
||||
"id": "485ge7ga-8596-73f7-fh5e-4f146a9651ea"
|
||||
},
|
||||
"trigger": {
|
||||
"key": "wix_form_app-form_submitted"
|
||||
}
|
||||
},
|
||||
"field:gclid": "",
|
||||
"formFieldMask": [
|
||||
"field:",
|
||||
"field:",
|
||||
"field:angebot_auswaehlen",
|
||||
"field:date_picker_a7c8",
|
||||
"field:date_picker_7e65",
|
||||
"field:",
|
||||
"field:number_7cf5",
|
||||
"field:anzahl_kinder",
|
||||
"field:alter_kind_3",
|
||||
"field:alter_kind_25",
|
||||
"field:alter_kind_4",
|
||||
"field:alter_kind_5",
|
||||
"field:alter_kind_6",
|
||||
"field:alter_kind_7",
|
||||
"field:alter_kind_8",
|
||||
"field:alter_kind_9",
|
||||
"field:alter_kind_10",
|
||||
"field:alter_kind_11",
|
||||
"field:",
|
||||
"field:anrede",
|
||||
"field:first_name_abae",
|
||||
"field:last_name_d97c",
|
||||
"field:email_5139",
|
||||
"field:phone_4c77",
|
||||
"field:long_answer_3524",
|
||||
"field:form_field_5a7b",
|
||||
"field:",
|
||||
"field:utm_source",
|
||||
"field:utm_medium",
|
||||
"field:utm_campaign",
|
||||
"field:utm_term",
|
||||
"field:utm_content",
|
||||
"field:utm_term_id",
|
||||
"field:utm_content_id",
|
||||
"field:gad_source",
|
||||
"field:gad_campaignid",
|
||||
"field:gbraid",
|
||||
"field:gclid",
|
||||
"field:fbclid",
|
||||
"field:hotelid",
|
||||
"field:hotelname",
|
||||
"field:",
|
||||
"metaSiteId"
|
||||
],
|
||||
"contact": {
|
||||
"name": {
|
||||
"first": "Sarah",
|
||||
"last": "Johnson"
|
||||
},
|
||||
"email": "sarah.johnson@businesscorp.com",
|
||||
"locale": "en-us",
|
||||
"phones": [
|
||||
{
|
||||
"tag": "UNTAGGED",
|
||||
"formattedPhone": "+1 555 987 6543",
|
||||
"id": "863d6eh7-9geg-7944-bc96-d83gc138513g",
|
||||
"countryCode": "US",
|
||||
"e164Phone": "+15559876543",
|
||||
"primary": true,
|
||||
"phone": "555 9876543"
|
||||
}
|
||||
],
|
||||
"contactId": "46982gdb-7368-7ah2-d99e-9fh794dg623h",
|
||||
"emails": [
|
||||
{
|
||||
"id": "h4g5491h-fd6f-7b77-b2c6-25h4e3dh03h6",
|
||||
"tag": "UNTAGGED",
|
||||
"email": "sarah.johnson@businesscorp.com",
|
||||
"primary": true
|
||||
}
|
||||
],
|
||||
"updatedDate": "2025-10-06T09:15:46.675Z",
|
||||
"phone": "+15559876543",
|
||||
"createdDate": "2025-10-06T09:15:46.675Z"
|
||||
},
|
||||
"submissionId": "b9h57agf-ch8d-7hea-baeb-9aaeth97978dg",
|
||||
"field:anzahl_kinder": "0",
|
||||
"field:first_name_abae": "Sarah",
|
||||
"field:utm_content_id": "",
|
||||
"field:utm_campaign": "",
|
||||
"field:utm_term": "",
|
||||
"contactId": "46982gdb-7368-7ah2-d99e-9fh794dg623h",
|
||||
"field:date_picker_a7c8": "2025-11-08",
|
||||
"field:hotelname": "Business Hotel Alpine",
|
||||
"field:angebot_auswaehlen": "Zimmer: Business Suite",
|
||||
"field:utm_content": "",
|
||||
"field:last_name_d97c": "Johnson",
|
||||
"field:hotelid": "135",
|
||||
"submissionsLink": "https://manage.wix.app/forms/submissions/4hde054f-b49b-7a69-c9h7-7e25hbe697f2/h317339e-di16-7h7g-e8i8-3a7441ffg6e4?d=https%3A%2F%2Fmanage.wix.com%2Fdashboard%2F4hde054f-b49b-7a69-c9h7-7e25hbe697f2%2Fwix-forms%2Fform%2Fh317339e-di16-7h7g-e8i8-3a7441ffg6e4%2Fsubmissions&s=true",
|
||||
"field:gbraid": "",
|
||||
"field:fbclid": "",
|
||||
"submissionPdf": {
|
||||
"fileName": "b9h57agf-ch8d-7hea-baeb-9aaeth97978dg.pdf",
|
||||
"downloadUrl": "https://manage.wix.com/_api/form-submission-service/v4/submissions/b9h57agf-ch8d-7hea-baeb-9aaeth97978dg/download?accessToken=JWS.eyJraWQiOiJWLVNuLWhwZSIsImFsZyI6IkhTMjU2In0.eyJkYXRhIjoie1wibWV0YVNpdGVJZFwiOlwiNGhkZTA1NGYtYjQ5Yi03YTY5LWM5aDctN2UyNWhiZTY5N2YyXCJ9IiwiaWF0IjoxNzU5ODI5MzQ2LCJleHAiOjE3NTk4Mjk5NDZ9.business_token_987654"
|
||||
},
|
||||
"field:anrede": "Frau",
|
||||
"field:long_answer_3524": "Business trip for conference. Need WiFi and workspace. Will arrive late on Monday.",
|
||||
"formId": "h317339e-di16-7h7g-e8i8-3a7441ffg6e4"
|
||||
}
}
|
||||
|
||||
275
tests/test_data/test_form5.json
Normal file
@@ -0,0 +1,275 @@
|
||||
{
|
||||
"data": {
|
||||
"formName": "Contact us",
|
||||
"submissions": [
|
||||
{
|
||||
"label": "Angebot auswählen",
|
||||
"value": "Zimmer: Doppelzimmer"
|
||||
},
|
||||
{
|
||||
"label": "Anreisedatum",
|
||||
"value": "2026-06-25"
|
||||
},
|
||||
{
|
||||
"label": "Abreisedatum",
|
||||
"value": "2026-06-28"
|
||||
},
|
||||
{
|
||||
"label": "Anzahl Erwachsene",
|
||||
"value": "7"
|
||||
},
|
||||
{
|
||||
"label": "Anzahl Kinder",
|
||||
"value": "3"
|
||||
},
|
||||
{
|
||||
"label": "Alter Kind 1",
|
||||
"value": "4"
|
||||
},
|
||||
{
|
||||
"label": "Alter Kind 2",
|
||||
"value": "3"
|
||||
},
|
||||
{
|
||||
"label": "Alter Kind 3",
|
||||
"value": "1"
|
||||
},
|
||||
{
|
||||
"label": "Alter Kind 4",
|
||||
"value": "0"
|
||||
},
|
||||
{
|
||||
"label": "Alter Kind 5",
|
||||
"value": "0"
|
||||
},
|
||||
{
|
||||
"label": "Alter Kind 6",
|
||||
"value": "0"
|
||||
},
|
||||
{
|
||||
"label": "Alter Kind 7",
|
||||
"value": "0"
|
||||
},
|
||||
{
|
||||
"label": "Anrede",
|
||||
"value": "Frau"
|
||||
},
|
||||
{
|
||||
"label": "Vorname",
|
||||
"value": "Petra"
|
||||
},
|
||||
{
|
||||
"label": "Nachname",
|
||||
"value": "Pohl"
|
||||
},
|
||||
{
|
||||
"label": "Email",
|
||||
"value": "petrapohl79@yahoo.com"
|
||||
},
|
||||
{
|
||||
"label": "Phone",
|
||||
"value": "+49 176 72219046"
|
||||
},
|
||||
{
|
||||
"label": "Message",
|
||||
"value": "Wir hatten als Familie vor einigen Jahren ein Wochenende bei Ihnen verbracht "
|
||||
},
|
||||
{
|
||||
"label": "utm_Source",
|
||||
"value": ""
|
||||
},
|
||||
{
|
||||
"label": "utm_Medium",
|
||||
"value": ""
|
||||
},
|
||||
{
|
||||
"label": "utm_Campaign",
|
||||
"value": ""
|
||||
},
|
||||
{
|
||||
"label": "utm_Term",
|
||||
"value": ""
|
||||
},
|
||||
{
|
||||
"label": "utm_Content",
|
||||
"value": ""
|
||||
},
|
||||
{
|
||||
"label": "utm_term_id",
|
||||
"value": ""
|
||||
},
|
||||
{
|
||||
"label": "utm_content_id",
|
||||
"value": ""
|
||||
},
|
||||
{
|
||||
"label": "gad_source",
|
||||
"value": "1"
|
||||
},
|
||||
{
|
||||
"label": "gad_campaignid",
|
||||
"value": "183901432"
|
||||
},
|
||||
{
|
||||
"label": "gbraid",
|
||||
"value": "0AAAAADxR52Ad0oCzeogeTrupgGeMwD7Yp"
|
||||
},
|
||||
{
|
||||
"label": "gclid",
|
||||
"value": "EAIaIQobChMIpNik0vXskAMVOoCDBx0cSQ1GEAAYASAAEgLog_D_BwE"
|
||||
},
|
||||
{
|
||||
"label": "fbclid",
|
||||
"value": ""
|
||||
},
|
||||
{
|
||||
"label": "hotelid",
|
||||
"value": "39054_001"
|
||||
},
|
||||
{
|
||||
"label": "hotelname",
|
||||
"value": "Bemelmans Post"
|
||||
}
|
||||
],
|
||||
"field:date_picker_7e65": "2026-06-28",
|
||||
"field:alter_kind_7": 0,
|
||||
"field:number_7cf5": 7,
|
||||
"field:utm_source": "",
|
||||
"submissionTime": "2025-11-12T15:32:23.208Z",
|
||||
"field:alter_kind_3": 4,
|
||||
"field:gad_source": "1",
|
||||
"field:gad_campaignid": "183901432",
|
||||
"field:utm_medium": "",
|
||||
"field:utm_term_id": "",
|
||||
"context": {
|
||||
"metaSiteId": "1dea821c-8168-4736-96e4-4b92e8b364cf",
|
||||
"activationId": "84dc33f6-ff90-4a2a-b77b-5f32fe9e3fc2"
|
||||
},
|
||||
"field:email_5139": "petrapohl79@yahoo.com",
|
||||
"field:phone_4c77": "+4917672219046",
|
||||
"_context": {
|
||||
"activation": {
|
||||
"id": "84dc33f6-ff90-4a2a-b77b-5f32fe9e3fc2"
|
||||
},
|
||||
"configuration": {
|
||||
"id": "a976f18c-fa86-495d-be1e-676df188eeae"
|
||||
},
|
||||
"app": {
|
||||
"id": "225dd912-7dea-4738-8688-4b8c6955ffc2"
|
||||
},
|
||||
"action": {
|
||||
"id": "152db4d7-5263-40c4-be2b-1c81476318b7"
|
||||
},
|
||||
"trigger": {
|
||||
"key": "wix_form_app-form_submitted"
|
||||
}
|
||||
},
|
||||
"field:gclid": "EAIaIQobChMIpNik0vXskAMVOoCDBx0cSQ1GEAAYASAAEgLog_D_BwE",
|
||||
"field:alter_kind_8": 0,
|
||||
"formFieldMask": [
|
||||
"field:",
|
||||
"field:",
|
||||
"field:angebot_auswaehlen",
|
||||
"field:date_picker_a7c8",
|
||||
"field:date_picker_7e65",
|
||||
"field:",
|
||||
"field:number_7cf5",
|
||||
"field:anzahl_kinder",
|
||||
"field:alter_kind_3",
|
||||
"field:alter_kind_25",
|
||||
"field:alter_kind_4",
|
||||
"field:alter_kind_5",
|
||||
"field:alter_kind_6",
|
||||
"field:alter_kind_7",
|
||||
"field:alter_kind_8",
|
||||
"field:alter_kind_9",
|
||||
"field:alter_kind_10",
|
||||
"field:alter_kind_11",
|
||||
"field:",
|
||||
"field:anrede",
|
||||
"field:first_name_abae",
|
||||
"field:last_name_d97c",
|
||||
"field:email_5139",
|
||||
"field:phone_4c77",
|
||||
"field:long_answer_3524",
|
||||
"field:form_field_5a7b",
|
||||
"field:",
|
||||
"field:utm_source",
|
||||
"field:utm_medium",
|
||||
"field:utm_campaign",
|
||||
"field:utm_term",
|
||||
"field:utm_content",
|
||||
"field:utm_term_id",
|
||||
"field:utm_content_id",
|
||||
"field:gad_source",
|
||||
"field:gad_campaignid",
|
||||
"field:gbraid",
|
||||
"field:gclid",
|
||||
"field:fbclid",
|
||||
"field:hotelid",
|
||||
"field:hotelname",
|
||||
"field:roomtypecode",
|
||||
"field:roomclassificationcode",
|
||||
"field:",
|
||||
"metaSiteId"
|
||||
],
|
||||
"field:alter_kind_4": 1,
|
||||
"contact": {
|
||||
"name": {
|
||||
"first": "Petra",
|
||||
"last": "Pohl"
|
||||
},
|
||||
"email": "petrapohl79@yahoo.com",
|
||||
"locale": "de-de",
|
||||
"phones": [
|
||||
{
|
||||
"tag": "UNTAGGED",
|
||||
"formattedPhone": "+49 176 72219046",
|
||||
"id": "332e1d29-9ece-4e87-85bd-f39654be00b8",
|
||||
"countryCode": "DE",
|
||||
"e164Phone": "+4917672219046",
|
||||
"primary": true,
|
||||
"phone": "176 72219046"
|
||||
}
|
||||
],
|
||||
"contactId": "5cd8d5a4-8705-49f5-b1be-c61dab5355d1",
|
||||
"emails": [
|
||||
{
|
||||
"id": "4dbbbe7c-2892-47e8-b7e2-982c9f44407d",
|
||||
"tag": "UNTAGGED",
|
||||
"email": "petrapohl79@yahoo.com",
|
||||
"primary": true
|
||||
}
|
||||
],
|
||||
"updatedDate": "2025-11-12T15:32:24.771Z",
|
||||
"phone": "+4917672219046",
|
||||
"createdDate": "2025-11-12T15:32:24.771Z"
|
||||
},
|
||||
"submissionId": "ae8837ee-bb2d-4bad-a065-7e366dde7e6b",
|
||||
"field:anzahl_kinder": 3,
|
||||
"field:alter_kind_25": 3,
|
||||
"field:alter_kind_5": 0,
|
||||
"field:first_name_abae": "Petra",
|
||||
"field:utm_content_id": "",
|
||||
"field:utm_campaign": "",
|
||||
"field:utm_term": "",
|
||||
"contactId": "5cd8d5a4-8705-49f5-b1be-c61dab5355d1",
|
||||
"field:alter_kind_6": 0,
|
||||
"field:date_picker_a7c8": "2026-06-25",
|
||||
"field:hotelname": "Bemelmans Post",
|
||||
"field:angebot_auswaehlen": "Zimmer: Doppelzimmer",
|
||||
"field:utm_content": "",
|
||||
"field:last_name_d97c": "Pohl",
|
||||
"field:hotelid": "39054_001",
|
||||
"submissionsLink": "https://manage.wix.app/forms/submissions/1dea821c-8168-4736-96e4-4b92e8b364cf/e084006b-ae83-4e4d-b2f5-074118cdb3b1?d=https%3A%2F%2Fmanage.wix.com%2Fdashboard%2F1dea821c-8168-4736-96e4-4b92e8b364cf%2Fwix-forms%2Fform%2Fe084006b-ae83-4e4d-b2f5-074118cdb3b1%2Fsubmissions&s=true",
|
||||
"field:gbraid": "0AAAAADxR52Ad0oCzeogeTrupgGeMwD7Yp",
|
||||
"field:fbclid": "",
|
||||
"submissionPdf": {
|
||||
"fileName": "ae8837ee-bb2d-4bad-a065-7e366dde7e6b.pdf",
|
||||
"downloadUrl": "https://manage.wix.com/_api/form-submission-service/v4/submissions/ae8837ee-bb2d-4bad-a065-7e366dde7e6b/download?accessToken=JWS.eyJraWQiOiJWLVNuLWhwZSIsImFsZyI6IkhTMjU2In0.eyJkYXRhIjoie1wibWV0YVNpdGVJZFwiOlwiMWRlYTgyMWMtODE2OC00NzM2LTk2ZTQtNGI5MmU4YjM2NGNmXCJ9IiwiaWF0IjoxNzYyOTYxNTQ1LCJleHAiOjE3NjI5NjIxNDV9.ThbV0VtGcaDfYzuQKsN24uWyUeCtwZq3Z3Z4flTqEBg"
|
||||
},
|
||||
"field:anrede": "Frau",
|
||||
"field:long_answer_3524": "Wir hatten als Familie vor einigen Jahren ein Wochenende bei Ihnen verbracht ",
|
||||
"formId": "e084006b-ae83-4e4d-b2f5-074118cdb3b1"
|
||||
}
|
||||
}
|
||||
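Note on the fixtures above: each payload has the same Wix Automations shape: a "data" object carrying a label/value "submissions" list, the same answers again as flat "field:*" keys, a "contact" record, and UTM/click-tracking parameters. Below is a minimal sketch of flattening such a payload for downstream processing; the helper name and the field-to-meaning mapping, e.g. date_picker_a7c8 as arrival date, are assumptions rather than code from this repository.

"""Minimal sketch: flatten a Wix form submission payload (assumed helper)."""

import json
from pathlib import Path
from typing import Any


def extract_submission(payload: dict[str, Any]) -> dict[str, Any]:
    """Pull the fields the fixtures above exercise out of one payload."""
    data = payload["data"]
    contact = data.get("contact", {})
    # Every answer arrives twice: in the "submissions" label/value list and
    # as a flat "field:<name>" key. The flat keys are easier to consume.
    fields = {
        key.removeprefix("field:"): value
        for key, value in data.items()
        if key.startswith("field:")
    }
    return {
        "submission_id": data.get("submissionId"),
        "form_id": data.get("formId"),
        "hotel_id": fields.get("hotelid"),
        "arrival": fields.get("date_picker_a7c8"),    # assumed: arrival date
        "departure": fields.get("date_picker_7e65"),  # assumed: departure date
        "adults": fields.get("number_7cf5"),
        "children": fields.get("anzahl_kinder"),
        "email": contact.get("email"),
        "name": contact.get("name", {}),
        "utm": {k: fields.get(k, "") for k in ("utm_source", "utm_medium", "utm_campaign")},
    }


if __name__ == "__main__":
    payload = json.loads(Path("tests/test_data/test_form5.json").read_text())
    print(extract_submission(payload))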
358
tests/test_email_service.py
Normal file
@@ -0,0 +1,358 @@
|
||||
"""Tests for email service and monitoring functionality."""
|
||||
|
||||
import asyncio
|
||||
import logging
|
||||
from datetime import datetime
|
||||
from unittest.mock import AsyncMock, MagicMock, patch
|
||||
|
||||
import pytest
|
||||
|
||||
from alpine_bits_python.email_monitoring import (
|
||||
DailyReportScheduler,
|
||||
EmailAlertHandler,
|
||||
ErrorRecord,
|
||||
)
|
||||
from alpine_bits_python.email_service import EmailConfig, EmailService
|
||||
|
||||
|
||||
class TestEmailConfig:
|
||||
"""Tests for EmailConfig class."""
|
||||
|
||||
def test_email_config_initialization(self):
|
||||
"""Test basic email configuration initialization."""
|
||||
config = {
|
||||
"smtp": {
|
||||
"host": "smtp.example.com",
|
||||
"port": 587,
|
||||
"username": "test@example.com",
|
||||
"password": "password123",
|
||||
"use_tls": True,
|
||||
"use_ssl": False,
|
||||
},
|
||||
"from_address": "sender@example.com",
|
||||
"from_name": "Test Sender",
|
||||
}
|
||||
|
||||
email_config = EmailConfig(config)
|
||||
|
||||
assert email_config.smtp_host == "smtp.example.com"
|
||||
assert email_config.smtp_port == 587
|
||||
assert email_config.smtp_username == "test@example.com"
|
||||
assert email_config.smtp_password == "password123"
|
||||
assert email_config.use_tls is True
|
||||
assert email_config.use_ssl is False
|
||||
assert email_config.from_address == "sender@example.com"
|
||||
assert email_config.from_name == "Test Sender"
|
||||
|
||||
def test_email_config_defaults(self):
|
||||
"""Test email configuration with default values."""
|
||||
config = {}
|
||||
|
||||
email_config = EmailConfig(config)
|
||||
|
||||
assert email_config.smtp_host == "localhost"
|
||||
assert email_config.smtp_port == 587
|
||||
assert email_config.use_tls is True
|
||||
assert email_config.use_ssl is False
|
||||
assert email_config.from_address == "noreply@example.com"
|
||||
|
||||
def test_email_config_tls_ssl_conflict(self):
|
||||
"""Test that TLS and SSL cannot both be enabled."""
|
||||
config = {
|
||||
"smtp": {
|
||||
"use_tls": True,
|
||||
"use_ssl": True,
|
||||
}
|
||||
}
|
||||
|
||||
with pytest.raises(ValueError, match="Cannot use both TLS and SSL"):
|
||||
EmailConfig(config)
|
||||
|
||||
|
||||
class TestEmailService:
|
||||
"""Tests for EmailService class."""
|
||||
|
||||
@pytest.fixture
|
||||
def email_config(self):
|
||||
"""Provide a test email configuration."""
|
||||
return EmailConfig(
|
||||
{
|
||||
"smtp": {
|
||||
"host": "smtp.example.com",
|
||||
"port": 587,
|
||||
"username": "test@example.com",
|
||||
"password": "password123",
|
||||
"use_tls": True,
|
||||
},
|
||||
"from_address": "sender@example.com",
|
||||
"from_name": "Test Sender",
|
||||
}
|
||||
)
|
||||
|
||||
@pytest.fixture
|
||||
def email_service(self, email_config):
|
||||
"""Provide an EmailService instance."""
|
||||
return EmailService(email_config)
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_send_email_success(self, email_service):
|
||||
"""Test successful email sending."""
|
||||
with patch.object(email_service, "_send_smtp") as mock_smtp:
|
||||
result = await email_service.send_email(
|
||||
recipients=["test@example.com"],
|
||||
subject="Test Subject",
|
||||
body="Test body",
|
||||
)
|
||||
|
||||
assert result is True
|
||||
assert mock_smtp.called
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_send_email_no_recipients(self, email_service):
|
||||
"""Test email sending with no recipients."""
|
||||
result = await email_service.send_email(
|
||||
recipients=[],
|
||||
subject="Test Subject",
|
||||
body="Test body",
|
||||
)
|
||||
|
||||
assert result is False
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_send_email_with_html(self, email_service):
|
||||
"""Test email sending with HTML body."""
|
||||
with patch.object(email_service, "_send_smtp") as mock_smtp:
|
||||
result = await email_service.send_email(
|
||||
recipients=["test@example.com"],
|
||||
subject="Test Subject",
|
||||
body="Plain text body",
|
||||
html_body="<html><body>HTML body</body></html>",
|
||||
)
|
||||
|
||||
assert result is True
|
||||
assert mock_smtp.called
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_send_alert(self, email_service):
|
||||
"""Test send_alert convenience method."""
|
||||
with patch.object(
|
||||
email_service, "send_email", new_callable=AsyncMock
|
||||
) as mock_send:
|
||||
mock_send.return_value = True
|
||||
|
||||
result = await email_service.send_alert(
|
||||
recipients=["test@example.com"],
|
||||
subject="Alert Subject",
|
||||
body="Alert body",
|
||||
)
|
||||
|
||||
assert result is True
|
||||
mock_send.assert_called_once_with(
|
||||
["test@example.com"], "Alert Subject", "Alert body"
|
||||
)
|
||||
|
||||
@pytest.mark.asyncio
|
||||
async def test_send_daily_report(self, email_service):
|
||||
"""Test daily report email generation and sending."""
|
||||
with patch.object(
|
||||
email_service, "send_email", new_callable=AsyncMock
|
||||
) as mock_send:
|
||||
mock_send.return_value = True
|
||||
|
||||
stats = {
|
||||
"total_reservations": 42,
|
||||
"new_customers": 15,
|
||||
}
|
||||
|
||||
errors = [
|
||||
{
|
||||
"timestamp": "2025-10-15 10:30:00",
|
||||
"level": "ERROR",
|
||||
"message": "Test error message",
|
||||
}
|
||||
]
|
||||
|
||||
result = await email_service.send_daily_report(
|
||||
recipients=["admin@example.com"],
|
||||
stats=stats,
|
||||
errors=errors,
|
||||
)
|
||||
|
||||
assert result is True
|
||||
assert mock_send.called
|
||||
call_args = mock_send.call_args
|
||||
assert "admin@example.com" in call_args[0][0]
|
||||
assert "Daily Report" in call_args[0][1]
|
||||
|
||||
|
||||
class TestErrorRecord:
|
||||
"""Tests for ErrorRecord class."""
|
||||
|
||||
def test_error_record_creation(self):
|
||||
"""Test creating an ErrorRecord from a logging record."""
|
||||
log_record = logging.LogRecord(
|
||||
name="test.logger",
|
||||
level=logging.ERROR,
|
||||
pathname="/path/to/file.py",
|
||||
lineno=42,
|
||||
msg="Test error message",
|
||||
args=(),
|
||||
exc_info=None,
|
||||
)
|
||||
|
||||
error_record = ErrorRecord(log_record)
|
||||
|
||||
assert error_record.level == "ERROR"
|
||||
assert error_record.logger_name == "test.logger"
|
||||
assert error_record.message == "Test error message"
|
||||
assert error_record.module == "file"
|
||||
assert error_record.line_no == 42
|
||||
|
||||
def test_error_record_to_dict(self):
|
||||
"""Test converting ErrorRecord to dictionary."""
|
||||
log_record = logging.LogRecord(
|
||||
name="test.logger",
|
||||
level=logging.ERROR,
|
||||
pathname="/path/to/file.py",
|
||||
lineno=42,
|
||||
msg="Test error",
|
||||
args=(),
|
||||
exc_info=None,
|
||||
)
|
||||
|
||||
error_record = ErrorRecord(log_record)
|
||||
error_dict = error_record.to_dict()
|
||||
|
||||
assert error_dict["level"] == "ERROR"
|
||||
assert error_dict["message"] == "Test error"
|
||||
assert error_dict["line_no"] == 42
|
||||
assert "timestamp" in error_dict
|
||||
|
||||
def test_error_record_format_plain_text(self):
|
||||
"""Test formatting ErrorRecord as plain text."""
|
||||
log_record = logging.LogRecord(
|
||||
name="test.logger",
|
||||
level=logging.ERROR,
|
||||
pathname="/path/to/file.py",
|
||||
lineno=42,
|
||||
msg="Test error",
|
||||
args=(),
|
||||
exc_info=None,
|
||||
)
|
||||
|
||||
error_record = ErrorRecord(log_record)
|
||||
formatted = error_record.format_plain_text()
|
||||
|
||||
assert "ERROR" in formatted
|
||||
assert "Test error" in formatted
|
||||
assert "file:42" in formatted
|
||||
|
||||
|
||||
class TestEmailAlertHandler:
|
||||
"""Tests for EmailAlertHandler class."""
|
||||
|
||||
@pytest.fixture
|
||||
def mock_email_service(self):
|
||||
"""Provide a mock email service."""
|
||||
service = MagicMock(spec=EmailService)
|
||||
service.send_alert = AsyncMock(return_value=True)
|
||||
return service
|
||||
|
||||
@pytest.fixture
|
||||
def handler_config(self):
|
||||
"""Provide handler configuration."""
|
||||
return {
|
||||
"recipients": ["alert@example.com"],
|
||||
"error_threshold": 3,
|
||||
"buffer_minutes": 1, # Short for testing
|
||||
"cooldown_minutes": 5,
|
||||
"log_levels": ["ERROR", "CRITICAL"],
|
||||
}
|
||||
|
||||
@pytest.fixture
|
||||
def alert_handler(self, mock_email_service, handler_config):
|
||||
"""Provide an EmailAlertHandler instance."""
|
||||
loop = asyncio.new_event_loop()
|
||||
handler = EmailAlertHandler(mock_email_service, handler_config, loop)
|
||||
yield handler
|
||||
loop.close()
|
||||
|
||||
def test_handler_initialization(self, alert_handler, handler_config):
|
||||
"""Test handler initialization."""
|
||||
assert alert_handler.error_threshold == 3
|
||||
assert alert_handler.buffer_minutes == 1
|
||||
assert alert_handler.cooldown_minutes == 5
|
||||
assert alert_handler.recipients == ["alert@example.com"]
|
||||
|
||||
def test_handler_ignores_non_error_levels(self, alert_handler):
|
||||
"""Test that handler ignores INFO and WARNING levels."""
|
||||
log_record = logging.LogRecord(
|
||||
name="test",
|
||||
level=logging.INFO,
|
||||
pathname="/test.py",
|
||||
lineno=1,
|
||||
msg="Info message",
|
||||
args=(),
|
||||
exc_info=None,
|
||||
)
|
||||
|
||||
alert_handler.emit(log_record)
|
||||
|
||||
# Should not buffer INFO messages
|
||||
assert len(alert_handler.error_buffer) == 0
|
||||
|
||||
|
||||
class TestDailyReportScheduler:
|
||||
"""Tests for DailyReportScheduler class."""
|
||||
|
||||
@pytest.fixture
|
||||
def mock_email_service(self):
|
||||
"""Provide a mock email service."""
|
||||
service = MagicMock(spec=EmailService)
|
||||
service.send_daily_report = AsyncMock(return_value=True)
|
||||
return service
|
||||
|
||||
@pytest.fixture
|
||||
def scheduler_config(self):
|
||||
"""Provide scheduler configuration."""
|
||||
return {
|
||||
"recipients": ["report@example.com"],
|
||||
"send_time": "08:00",
|
||||
"include_stats": True,
|
||||
"include_errors": True,
|
||||
}
|
||||
|
||||
@pytest.fixture
|
||||
def scheduler(self, mock_email_service, scheduler_config):
|
||||
"""Provide a DailyReportScheduler instance."""
|
||||
return DailyReportScheduler(mock_email_service, scheduler_config)
|
||||
|
||||
def test_scheduler_initialization(self, scheduler, scheduler_config):
|
||||
"""Test scheduler initialization."""
|
||||
assert scheduler.send_time == "08:00"
|
||||
assert scheduler.recipients == ["report@example.com"]
|
||||
assert scheduler.include_stats is True
|
||||
assert scheduler.include_errors is True
|
||||
|
||||
def test_scheduler_log_error(self, scheduler):
|
||||
"""Test logging errors for daily report."""
|
||||
error = {
|
||||
"timestamp": datetime.now().isoformat(),
|
||||
"level": "ERROR",
|
||||
"message": "Test error",
|
||||
}
|
||||
|
||||
scheduler.log_error(error)
|
||||
|
||||
assert len(scheduler._error_log) == 1
|
||||
assert scheduler._error_log[0]["message"] == "Test error"
|
||||
|
||||
def test_scheduler_set_stats_collector(self, scheduler):
|
||||
"""Test setting stats collector function."""
|
||||
|
||||
async def mock_collector():
|
||||
return {"test": "stats"}
|
||||
|
||||
scheduler.set_stats_collector(mock_collector)
|
||||
|
||||
assert scheduler._stats_collector is mock_collector
|
||||
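Taken together, the tests above pin down the alert handler's contract: only levels listed in log_levels are buffered, an email goes out once error_threshold records accumulate, and cooldown_minutes suppresses repeat alerts. The following is a minimal sketch of a logging.Handler with that contract; the timed buffer_minutes flush is left out for brevity, and the real EmailAlertHandler in alpine_bits_python.email_monitoring may differ.

"""Minimal sketch of a buffered email alert handler (assumed design)."""

import asyncio
import logging
import time


class BufferedAlertHandler(logging.Handler):
    def __init__(self, email_service, config: dict, loop: asyncio.AbstractEventLoop):
        super().__init__()
        self.email_service = email_service
        self.loop = loop
        self.recipients = config["recipients"]
        self.error_threshold = config.get("error_threshold", 3)
        self.buffer_minutes = config.get("buffer_minutes", 1)  # timed flush omitted here
        self.cooldown_minutes = config.get("cooldown_minutes", 5)
        self.log_levels = set(config.get("log_levels", ["ERROR", "CRITICAL"]))
        self.error_buffer: list[logging.LogRecord] = []
        self._last_alert: float | None = None

    def emit(self, record: logging.LogRecord) -> None:
        if record.levelname not in self.log_levels:
            return  # matches the test: INFO and WARNING are never buffered
        self.error_buffer.append(record)
        in_cooldown = (
            self._last_alert is not None
            and time.monotonic() - self._last_alert < self.cooldown_minutes * 60
        )
        if len(self.error_buffer) >= self.error_threshold and not in_cooldown:
            body = "\n".join(self.format(r) for r in self.error_buffer)
            self.error_buffer.clear()
            self._last_alert = time.monotonic()
            # Hand off to the async email service without blocking the logger.
            asyncio.run_coroutine_threadsafe(
                self.email_service.send_alert(
                    recipients=self.recipients,
                    subject=f"{self.error_threshold}+ errors logged",
                    body=body,
                ),
                self.loop,
            )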
62
tests/test_worker_coordination.py
Normal file
@@ -0,0 +1,62 @@
|
||||
#!/usr/bin/env python3
|
||||
"""Test script to verify worker coordination with file locking.
|
||||
|
||||
This simulates multiple workers trying to acquire the primary worker lock.
|
||||
"""
|
||||
|
||||
import multiprocessing
|
||||
import time
|
||||
from pathlib import Path
|
||||
|
||||
from src.alpine_bits_python.worker_coordination import WorkerLock
|
||||
|
||||
|
||||
def worker_process(worker_id: int, lock_file: str):
|
||||
"""Simulate a worker process trying to acquire the lock."""
|
||||
print(f"Worker {worker_id} (PID {multiprocessing.current_process().pid}): Starting")
|
||||
|
||||
lock = WorkerLock(lock_file)
|
||||
is_primary = lock.acquire()
|
||||
|
||||
if is_primary:
|
||||
print(f"Worker {worker_id} (PID {multiprocessing.current_process().pid}): ✓ I am PRIMARY")
|
||||
# Simulate running singleton services
|
||||
time.sleep(3)
|
||||
print(f"Worker {worker_id} (PID {multiprocessing.current_process().pid}): Releasing lock")
|
||||
lock.release()
|
||||
else:
|
||||
print(f"Worker {worker_id} (PID {multiprocessing.current_process().pid}): ✗ I am SECONDARY")
|
||||
# Simulate regular worker work
|
||||
time.sleep(3)
|
||||
|
||||
print(f"Worker {worker_id} (PID {multiprocessing.current_process().pid}): Exiting")
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
# Use a test lock file
|
||||
lock_file = "/tmp/test_alpinebits_worker.lock"
|
||||
|
||||
# Clean up any existing lock file
|
||||
Path(lock_file).unlink(missing_ok=True)
|
||||
|
||||
print("Starting 4 worker processes (simulating uvicorn --workers 4)")
|
||||
print("=" * 70)
|
||||
|
||||
# Start multiple workers
|
||||
processes = []
|
||||
for i in range(4):
|
||||
p = multiprocessing.Process(target=worker_process, args=(i, lock_file))
|
||||
p.start()
|
||||
processes.append(p)
|
||||
# Small delay to make output clearer
|
||||
time.sleep(0.1)
|
||||
|
||||
# Wait for all workers to complete
|
||||
for p in processes:
|
||||
p.join()
|
||||
|
||||
print("=" * 70)
|
||||
print("✓ Test complete: Only ONE worker should have been PRIMARY")
|
||||
|
||||
# Clean up
|
||||
Path(lock_file).unlink(missing_ok=True)
|
||||
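The lock implementation itself is not part of this diff; the script above only relies on acquire() returning True for exactly one process, plus release(). Below is a minimal sketch of such a WorkerLock built on a non-blocking flock, which reproduces the behaviour exercised above; the actual worker_coordination.py may differ.

"""Minimal sketch of a primary-worker lock via flock (assumed design)."""

import fcntl
import os


class WorkerLock:
    def __init__(self, lock_file: str):
        self.lock_file = lock_file
        self._fd: int | None = None

    def acquire(self) -> bool:
        """Return True if this process won the exclusive lock."""
        self._fd = os.open(self.lock_file, os.O_CREAT | os.O_RDWR)
        try:
            # Non-blocking exclusive lock: exactly one process can hold it.
            fcntl.flock(self._fd, fcntl.LOCK_EX | fcntl.LOCK_NB)
        except BlockingIOError:
            os.close(self._fd)
            self._fd = None
            return False
        os.write(self._fd, str(os.getpid()).encode())
        return True

    def release(self) -> None:
        if self._fd is not None:
            fcntl.flock(self._fd, fcntl.LOCK_UN)
            os.close(self._fd)
            self._fd = None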
118
uv.lock
generated
@@ -21,11 +21,14 @@ source = { editable = "." }
|
||||
dependencies = [
|
||||
{ name = "aiosqlite" },
|
||||
{ name = "annotatedyaml" },
|
||||
{ name = "asyncpg" },
|
||||
{ name = "dotenv" },
|
||||
{ name = "fast-langdetect" },
|
||||
{ name = "fastapi" },
|
||||
{ name = "generateds" },
|
||||
{ name = "httpx" },
|
||||
{ name = "lxml" },
|
||||
{ name = "pushover-complete" },
|
||||
{ name = "pydantic", extra = ["email"] },
|
||||
{ name = "pytest" },
|
||||
{ name = "pytest-asyncio" },
|
||||
@@ -48,11 +51,14 @@ dev = [
|
||||
requires-dist = [
|
||||
{ name = "aiosqlite", specifier = ">=0.21.0" },
|
||||
{ name = "annotatedyaml", specifier = ">=1.0.0" },
|
||||
{ name = "asyncpg", specifier = ">=0.30.0" },
|
||||
{ name = "dotenv", specifier = ">=0.9.9" },
|
||||
{ name = "fast-langdetect", specifier = ">=1.0.0" },
|
||||
{ name = "fastapi", specifier = ">=0.117.1" },
|
||||
{ name = "generateds", specifier = ">=2.44.3" },
|
||||
{ name = "httpx", specifier = ">=0.28.1" },
|
||||
{ name = "lxml", specifier = ">=6.0.1" },
|
||||
{ name = "pushover-complete", specifier = ">=2.0.0" },
|
||||
{ name = "pydantic", extras = ["email"], specifier = ">=2.11.9" },
|
||||
{ name = "pytest", specifier = ">=8.4.2" },
|
||||
{ name = "pytest-asyncio", specifier = ">=1.2.0" },
|
||||
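Of the dependencies added in this hunk, asyncpg is the functional one: it adds a PostgreSQL driver next to the existing aiosqlite entry. A minimal usage sketch follows, with a placeholder DSN rather than the project's real connection settings.

import asyncio

import asyncpg  # new dependency pinned above


async def main() -> None:
    # Placeholder DSN for illustration only.
    pool = await asyncpg.create_pool(
        dsn="postgresql://user:password@localhost:5432/alpinebits",
        min_size=1,
        max_size=5,
    )
    async with pool.acquire() as conn:
        print(await conn.fetchval("SELECT version()"))
    await pool.close()


asyncio.run(main())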
@@ -131,6 +137,22 @@ wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/15/b3/9b1a8074496371342ec1e796a96f99c82c945a339cd81a8e73de28b4cf9e/anyio-4.11.0-py3-none-any.whl", hash = "sha256:0287e96f4d26d4149305414d4e3bc32f0dcd0862365a4bddea19d7a1ec38c4fc", size = 109097, upload-time = "2025-09-23T09:19:10.601Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "asyncpg"
|
||||
version = "0.30.0"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/2f/4c/7c991e080e106d854809030d8584e15b2e996e26f16aee6d757e387bc17d/asyncpg-0.30.0.tar.gz", hash = "sha256:c551e9928ab6707602f44811817f82ba3c446e018bfe1d3abecc8ba5f3eac851", size = 957746, upload-time = "2024-10-20T00:30:41.127Z" }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/3a/22/e20602e1218dc07692acf70d5b902be820168d6282e69ef0d3cb920dc36f/asyncpg-0.30.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:05b185ebb8083c8568ea8a40e896d5f7af4b8554b64d7719c0eaa1eb5a5c3a70", size = 670373, upload-time = "2024-10-20T00:29:55.165Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/3d/b3/0cf269a9d647852a95c06eb00b815d0b95a4eb4b55aa2d6ba680971733b9/asyncpg-0.30.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:c47806b1a8cbb0a0db896f4cd34d89942effe353a5035c62734ab13b9f938da3", size = 634745, upload-time = "2024-10-20T00:29:57.14Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/8e/6d/a4f31bf358ce8491d2a31bfe0d7bcf25269e80481e49de4d8616c4295a34/asyncpg-0.30.0-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9b6fde867a74e8c76c71e2f64f80c64c0f3163e687f1763cfaf21633ec24ec33", size = 3512103, upload-time = "2024-10-20T00:29:58.499Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/96/19/139227a6e67f407b9c386cb594d9628c6c78c9024f26df87c912fabd4368/asyncpg-0.30.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:46973045b567972128a27d40001124fbc821c87a6cade040cfcd4fa8a30bcdc4", size = 3592471, upload-time = "2024-10-20T00:30:00.354Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/67/e4/ab3ca38f628f53f0fd28d3ff20edff1c975dd1cb22482e0061916b4b9a74/asyncpg-0.30.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:9110df111cabc2ed81aad2f35394a00cadf4f2e0635603db6ebbd0fc896f46a4", size = 3496253, upload-time = "2024-10-20T00:30:02.794Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/ef/5f/0bf65511d4eeac3a1f41c54034a492515a707c6edbc642174ae79034d3ba/asyncpg-0.30.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:04ff0785ae7eed6cc138e73fc67b8e51d54ee7a3ce9b63666ce55a0bf095f7ba", size = 3662720, upload-time = "2024-10-20T00:30:04.501Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/e7/31/1513d5a6412b98052c3ed9158d783b1e09d0910f51fbe0e05f56cc370bc4/asyncpg-0.30.0-cp313-cp313-win32.whl", hash = "sha256:ae374585f51c2b444510cdf3595b97ece4f233fde739aa14b50e0d64e8a7a590", size = 560404, upload-time = "2024-10-20T00:30:06.537Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/c8/a4/cec76b3389c4c5ff66301cd100fe88c318563ec8a520e0b2e792b5b84972/asyncpg-0.30.0-cp313-cp313-win_amd64.whl", hash = "sha256:f59b430b8e27557c3fb9869222559f7417ced18688375825f8f12302c34e915e", size = 621623, upload-time = "2024-10-20T00:30:09.024Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "certifi"
|
||||
version = "2025.8.3"
|
||||
@@ -204,6 +226,18 @@ wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/d1/d6/3965ed04c63042e047cb6a3e6ed1a63a35087b6a609aa3a15ed8ac56c221/colorama-0.4.6-py2.py3-none-any.whl", hash = "sha256:4f1d9991f5acc0ca119f9d443620b77f9d6b33703e51011c16baf57afb285fc6", size = 25335, upload-time = "2022-10-25T02:36:20.889Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "colorlog"
|
||||
version = "6.9.0"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
dependencies = [
|
||||
{ name = "colorama", marker = "sys_platform == 'win32'" },
|
||||
]
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/d3/7a/359f4d5df2353f26172b3cc39ea32daa39af8de522205f512f458923e677/colorlog-6.9.0.tar.gz", hash = "sha256:bfba54a1b93b94f54e1f4fe48395725a3d92fd2a4af702f6bd70946bdc0c6ac2", size = 16624, upload-time = "2024-10-29T18:34:51.011Z" }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/e3/51/9b208e85196941db2f0654ad0357ca6388ab3ed67efdbfc799f35d1f83aa/colorlog-6.9.0-py3-none-any.whl", hash = "sha256:5906e71acd67cb07a71e779c47c4bcb45fb8c2993eebe9e5adcd6a6f1b283eff", size = 11424, upload-time = "2024-10-29T18:34:49.815Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "coverage"
|
||||
version = "7.10.7"
|
||||
@@ -323,6 +357,20 @@ wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/de/15/545e2b6cf2e3be84bc1ed85613edd75b8aea69807a71c26f4ca6a9258e82/email_validator-2.3.0-py3-none-any.whl", hash = "sha256:80f13f623413e6b197ae73bb10bf4eb0908faf509ad8362c5edeb0be7fd450b4", size = 35604, upload-time = "2025-08-26T13:09:05.858Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "fast-langdetect"
|
||||
version = "1.0.0"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
dependencies = [
|
||||
{ name = "fasttext-predict" },
|
||||
{ name = "requests" },
|
||||
{ name = "robust-downloader" },
|
||||
]
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/53/15/85b0137066be418b6249d8e8d98e2b16c072c65b80c293b9438fdea1be5e/fast_langdetect-1.0.0.tar.gz", hash = "sha256:ea8ac6a8914e0ff1bfc1bbc0f25992eb913ddb69e63ea1b24e907e263d0cd113", size = 796192, upload-time = "2025-09-17T06:32:26.86Z" }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/f6/71/0db1ac89f8661048ebc22d62f503a2e147cb6872c5f2aeb659c1f02c1694/fast_langdetect-1.0.0-py3-none-any.whl", hash = "sha256:aab9e3435cc667ac8ba8b1a38872f75492f65b7087901d0f3a02a88d436cd22a", size = 789944, upload-time = "2025-09-17T06:32:25.363Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "fastapi"
|
||||
version = "0.117.1"
|
||||
@@ -337,6 +385,38 @@ wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/6d/45/d9d3e8eeefbe93be1c50060a9d9a9f366dba66f288bb518a9566a23a8631/fastapi-0.117.1-py3-none-any.whl", hash = "sha256:33c51a0d21cab2b9722d4e56dbb9316f3687155be6b276191790d8da03507552", size = 95959, upload-time = "2025-09-20T20:16:53.661Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "fasttext-predict"
|
||||
version = "0.9.2.4"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/fc/0e/9defbb9385bcb1104cc1d686a14f7d9fafe5fe43f220cccb00f33d91bb47/fasttext_predict-0.9.2.4.tar.gz", hash = "sha256:18a6fb0d74c7df9280db1f96cb75d990bfd004fa9d669493ea3dd3d54f84dbc7", size = 16332, upload-time = "2024-11-23T17:24:44.801Z" }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/89/fc/5cd65224c33e33d6faec3fa1047162dc266ed2213016139d936bd36fb7c3/fasttext_predict-0.9.2.4-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:ddb85e62c95e4e02d417c782e3434ef65554df19e3522f5230f6be15a9373c05", size = 104916, upload-time = "2024-11-23T17:23:43.367Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/d9/53/8d542773e32c9d98dd8c680e390fe7e6d4fc92ab3439dc1bb8e70c46c7ad/fasttext_predict-0.9.2.4-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:102129d45cf98dda871e83ae662f71d999b9ef6ff26bc842ffc1520a1f82930c", size = 97502, upload-time = "2024-11-23T17:23:44.447Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/50/99/049fd6b01937705889bd9a00c31e5c55f0ae4b7704007b2ef7a82bf2b867/fasttext_predict-0.9.2.4-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:05ba6a0fbf8cb2141b8ca2bc461db97af8ac31a62341e4696a75048b9de39e10", size = 282951, upload-time = "2024-11-23T17:23:46.31Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/83/cb/79b71709edbb53c3c5f8a8b60fe2d3bc98d28a8e75367c89afedf3307aa9/fasttext_predict-0.9.2.4-cp313-cp313-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0c7a779215571296ecfcf86545cb30ec3f1c6f43cbcd69f83cc4f67049375ea1", size = 307377, upload-time = "2024-11-23T17:23:47.685Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/7c/4a/b15b7be003e76613173cc77d9c6cce4bf086073079354e0177deaa768f59/fasttext_predict-0.9.2.4-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ddd2f03f3f206585543f5274b1dbc5f651bae141a1b14c9d5225c2a12e5075c2", size = 295746, upload-time = "2024-11-23T17:23:49.024Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/e3/d3/f030cd45bdd4b052fcf23e730fdf0804e024b0cad43d7c7f8704faaec2f5/fasttext_predict-0.9.2.4-cp313-cp313-manylinux_2_31_armv7l.whl", hash = "sha256:748f9edc3222a1fb7a61331c4e06d3b7f2390ae493f91f09d372a00b81762a8d", size = 236939, upload-time = "2024-11-23T17:23:50.306Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/a2/01/6f2985afd58fdc5f4ecd058d5d9427d03081d468960982df97316c03f6bb/fasttext_predict-0.9.2.4-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:1aee47a40757cd24272b34eaf9ceeea86577fd0761b0fd0e41599c6549abdf04", size = 1214189, upload-time = "2024-11-23T17:23:51.647Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/75/07/931bcdd4e2406e45e54d57e056c2e0766616a5280a18fbf6ef078aa439ab/fasttext_predict-0.9.2.4-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:6ff0f152391ee03ffc18495322100c01735224f7843533a7c4ff33c8853d7be1", size = 1099889, upload-time = "2024-11-23T17:23:53.127Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/a2/eb/6521b4bbf387252a96a6dc0f54986f078a93db0a9d4ba77258dcf1fa8be7/fasttext_predict-0.9.2.4-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:4d92f5265318b41d6e68659fd459babbff692484e492c5013995b90a56b517c9", size = 1383959, upload-time = "2024-11-23T17:23:54.521Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/b7/6b/d56606761afb3a3912c52971f0f804e2e9065f049c412b96c47d6fca6218/fasttext_predict-0.9.2.4-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:3a7720cce1b8689d88df76cac1425e84f9911c69a4e40a5309d7d3435e1bb97c", size = 1281097, upload-time = "2024-11-23T17:23:55.9Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/91/83/55bb4a37bb3b3a428941f4e1323c345a662254f576f8860b3098d9742510/fasttext_predict-0.9.2.4-cp313-cp313-win32.whl", hash = "sha256:d16acfced7871ed0cd55b476f0dbdddc7a5da1ffc9745a3c5674846cf1555886", size = 91137, upload-time = "2024-11-23T17:23:57.886Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/9c/1d/c1ccc8790ce54200c84164d99282f088dddb9760aeefc8860856aafa40b4/fasttext_predict-0.9.2.4-cp313-cp313-win_amd64.whl", hash = "sha256:96a23328729ce62a851f8953582e576ca075ee78d637df4a78a2b3609784849e", size = 104896, upload-time = "2024-11-23T17:23:59.028Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/a4/c9/a1ccc749c59e2480767645ecc03bd842a7fa5b2b780d69ac370e6f8298d2/fasttext_predict-0.9.2.4-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:b1357d0d9d8568db84668b57e7c6880b9c46f757e8954ad37634402d36f09dba", size = 109401, upload-time = "2024-11-23T17:24:00.191Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/90/1f/33182b76eb0524155e8ff93e7939feaf5325385e5ff2a154f383d9a02317/fasttext_predict-0.9.2.4-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:9604c464c5d86c7eba34b040080be7012e246ef512b819e428b7deb817290dae", size = 102131, upload-time = "2024-11-23T17:24:02.052Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/2b/df/1886daea373382e573f28ce49e3fc8fb6b0ee0c84e2b0becf5b254cd93fb/fasttext_predict-0.9.2.4-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cc6da186c2e4497cbfaba9c5424e58c7b72728b25d980829eb96daccd7cface1", size = 287396, upload-time = "2024-11-23T17:24:03.294Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/35/8f/d1c2c0f0251bee898d508253a437683b0480a1074cfb25ded1f7fdbb925a/fasttext_predict-0.9.2.4-cp313-cp313t-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:366ed2ca4f4170418f3585e92059cf17ee2c963bf179111c5b8ba48f06cd69d1", size = 311090, upload-time = "2024-11-23T17:24:04.625Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/5d/52/07d6ed46148662fae84166bc69d944caca87fabc850ebfbd9640b20dafe7/fasttext_predict-0.9.2.4-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2f1877edbb815a43e7d38cc7332202e759054cf0b5a4b7e34a743c0f5d6e7333", size = 300359, upload-time = "2024-11-23T17:24:06.486Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/fa/a1/751ff471a991e5ed0bae9e7fa6fc8d8ab76b233a7838a27d70d62bed0c8e/fasttext_predict-0.9.2.4-cp313-cp313t-manylinux_2_31_armv7l.whl", hash = "sha256:f63c31352ba6fc910290b0fe12733770acd8cfa0945fcb9cf3984d241abcfc9d", size = 241164, upload-time = "2024-11-23T17:24:08.501Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/94/19/e251f699a0e9c001fa672ea0929c456160faa68ecfafc19e8def09982b6a/fasttext_predict-0.9.2.4-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:898e14b03fbfb0a8d9a5185a0a00ff656772b3baa37cad122e06e8e4d6da3832", size = 1218629, upload-time = "2024-11-23T17:24:10.04Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/1d/46/1af2f779f8cfd746496a226581f747d3051888e3e3c5b2ca37231e5d04f8/fasttext_predict-0.9.2.4-cp313-cp313t-musllinux_1_2_armv7l.whl", hash = "sha256:a33bb5832a69fc54d18cadcf015677c1acb5ccc7f0125d261df2a89f8aff01f6", size = 1100535, upload-time = "2024-11-23T17:24:11.5Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/4c/b7/900ccd74a9ba8be7ca6d04bba684e9c43fb0dbed8a3d12ec0536228e2c32/fasttext_predict-0.9.2.4-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:7fe9e98bd0701d598bf245eb2fbf592145cd03551684a2102a4b301294b9bd87", size = 1387651, upload-time = "2024-11-23T17:24:13.135Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/0b/5a/99fdaed054079f7c96e70df0d7016c4eb6b9e487a614396dd8f849244a52/fasttext_predict-0.9.2.4-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:dcb8c5a74c1785f005fd83d445137437b79ac70a2dfbfe4bb1b09aa5643be545", size = 1286189, upload-time = "2024-11-23T17:24:14.615Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/87/6a/9114d65b3f7a9c20a62b9d2ca3b770ee65de849e4131cc7aa58cdc50cb07/fasttext_predict-0.9.2.4-cp313-cp313t-win32.whl", hash = "sha256:a85c7de3d4480faa12b930637fca9c23144d1520786fedf9ba8edd8642ed4aea", size = 95905, upload-time = "2024-11-23T17:24:15.868Z" },
|
||||
{ url = "https://files.pythonhosted.org/packages/31/fb/6d251f3fdfe3346ee60d091f55106513e509659ee005ad39c914182c96f4/fasttext_predict-0.9.2.4-cp313-cp313t-win_amd64.whl", hash = "sha256:be0933fa4af7abae09c703d28f9e17c80e7069eb6f92100b21985b777f4ea275", size = 110325, upload-time = "2024-11-23T17:24:16.984Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "generateds"
|
||||
version = "2.44.3"
|
||||
@@ -586,6 +666,18 @@ wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/cc/35/cc0aaecf278bb4575b8555f2b137de5ab821595ddae9da9d3cd1da4072c7/propcache-0.3.2-py3-none-any.whl", hash = "sha256:98f1ec44fb675f5052cccc8e609c46ed23a35a1cfd18545ad4e29002d858a43f", size = 12663, upload-time = "2025-06-09T22:56:04.484Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "pushover-complete"
|
||||
version = "2.0.0"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
dependencies = [
|
||||
{ name = "requests" },
|
||||
]
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/db/ae/2ed5c277e22316d8a31e2f67c6c9fd5021189ed3754e144aad53d874d687/pushover_complete-2.0.0.tar.gz", hash = "sha256:24fc7d84d73426840e7678fee80d36f40df0114cb30352ba4f99ab3842ed21a7", size = 19035, upload-time = "2025-05-20T12:47:59.464Z" }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/9e/c2/7debacdeb30d5956e5c5573f129ea2a422eeaaba8993ddfc61c9c0e54c95/pushover_complete-2.0.0-py3-none-any.whl", hash = "sha256:9dbb540daf86b26375e0aaa4b798ad5936b27047ee82cf3213bafeee96929527", size = 9952, upload-time = "2025-05-20T12:47:58.248Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "pydantic"
|
||||
version = "2.11.9"
|
||||
@@ -754,6 +846,20 @@ wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/1e/db/4254e3eabe8020b458f1a747140d32277ec7a271daf1d235b70dc0b4e6e3/requests-2.32.5-py3-none-any.whl", hash = "sha256:2462f94637a34fd532264295e186976db0f5d453d1cdd31473c85a6a161affb6", size = 64738, upload-time = "2025-08-18T20:46:00.542Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "robust-downloader"
|
||||
version = "0.0.2"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
dependencies = [
|
||||
{ name = "colorlog" },
|
||||
{ name = "requests" },
|
||||
{ name = "tqdm" },
|
||||
]
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/63/20/8d28efa080f58fa06f6378875ac482ee511c076369e5293a2e65128cf9a0/robust-downloader-0.0.2.tar.gz", hash = "sha256:08c938b96e317abe6b037e34230a91bda9b5d613f009bca4a47664997c61de90", size = 15785, upload-time = "2023-11-13T03:00:20.637Z" }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/56/a1/779e9d0ebbdc704411ce30915a1105eb01aeaa9e402d7e446613ff8fb121/robust_downloader-0.0.2-py3-none-any.whl", hash = "sha256:8fe08bfb64d714fd1a048a7df6eb7b413eb4e624309a49db2c16fbb80a62869d", size = 15534, upload-time = "2023-11-13T03:00:18.957Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "ruff"
|
||||
version = "0.13.1"
|
||||
@@ -852,6 +958,18 @@ wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/f6/17/57b444fd314d5e1593350b9a31d000e7411ba8e17ce12dc7ad54ca76b810/toposort-1.10-py3-none-any.whl", hash = "sha256:cbdbc0d0bee4d2695ab2ceec97fe0679e9c10eab4b2a87a9372b929e70563a87", size = 8500, upload-time = "2023-02-25T20:07:06.538Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "tqdm"
|
||||
version = "4.67.1"
|
||||
source = { registry = "https://pypi.org/simple" }
|
||||
dependencies = [
|
||||
{ name = "colorama", marker = "sys_platform == 'win32'" },
|
||||
]
|
||||
sdist = { url = "https://files.pythonhosted.org/packages/a8/4b/29b4ef32e036bb34e4ab51796dd745cdba7ed47ad142a9f4a1eb8e0c744d/tqdm-4.67.1.tar.gz", hash = "sha256:f8aef9c52c08c13a65f30ea34f4e5aac3fd1a34959879d7e59e63027286627f2", size = 169737, upload-time = "2024-11-24T20:12:22.481Z" }
|
||||
wheels = [
|
||||
{ url = "https://files.pythonhosted.org/packages/d0/30/dc54f88dd4a2b5dc8a0279bdd7270e735851848b762aeb1c1184ed1f6b14/tqdm-4.67.1-py3-none-any.whl", hash = "sha256:26445eca388f82e72884e0d580d5464cd801a3ea01e63e5601bdff9ba6a48de2", size = 78540, upload-time = "2024-11-24T20:12:19.698Z" },
|
||||
]
|
||||
|
||||
[[package]]
|
||||
name = "typing-extensions"
|
||||
version = "4.15.0"
|
||||
|
||||