db_modeling_for_capi #5

Merged
jonas merged 23 commits from db_modeling_for_capi into main 2025-10-10 14:57:52 +00:00
8 changed files with 234 additions and 12 deletions
Showing only changes of commit a325a443f7

View File

@@ -26,6 +26,10 @@ Data flows: Wix form → Database → AlpineBits XML → Hotel systems (pull or
- Default config location: `config/config.yaml` + `config/secrets.yaml`
- Override via `ALPINEBITS_CONFIG_DIR` environment variable
- Multi-hotel support: Each hotel in `alpine_bits_auth` array gets own credentials and optional `push_endpoint`
- **Logging**: Centralized logging configured via `logger` section (see `logging_config.py` and `LOGGING.md`)
- Use `from alpine_bits_python.logging_config import get_logger; _LOGGER = get_logger(__name__)` in any module
- Logs to console always; optionally to file if `logger.file` is set
- Format includes timestamp: `%(asctime)s - %(name)s - %(levelname)s - %(message)s`
### Database Layer

LOGGING.md (new file, 118 lines)
View File

@@ -0,0 +1,118 @@
# Logging Configuration
The AlpineBits Python server uses a centralized logging system that can be configured via the `config.yaml` file.
## Configuration
Add the following section to your `config/config.yaml`:
```yaml
logger:
level: "INFO" # Options: DEBUG, INFO, WARNING, ERROR, CRITICAL
file: "logs/alpinebits.log" # Optional: path to log file (omit or set to null for console-only)
```
### Log Levels
- **DEBUG**: Detailed diagnostic information (very verbose)
- **INFO**: General informational messages about application progress
- **WARNING**: Warning messages about potential issues
- **ERROR**: Error messages when something goes wrong
- **CRITICAL**: Critical errors that may cause application failure
### Log Output
- **Console**: Logs are always written to console (stdout)
- **File**: Optionally write logs to a file by specifying the `file` parameter
- File logs include the same timestamp and formatting as console logs
- Log directory will be created automatically if it doesn't exist
## Usage in Code
To use logging in any module:
```python
from alpine_bits_python.logging_config import get_logger
_LOGGER = get_logger(__name__)
# Then use the logger
_LOGGER.info("Application started")
_LOGGER.debug("Detailed debug information: %s", some_variable)
_LOGGER.warning("Something unusual happened")
_LOGGER.error("An error occurred: %s", error_message)
_LOGGER.exception("Critical error with stack trace")
```
## Log Format
All log entries include:
- Timestamp (YYYY-MM-DD HH:MM:SS)
- Module name (logger name)
- Log level
- Message
Example:
```
2025-10-09 14:23:45 - alpine_bits_python.api - INFO - Application startup initiated
2025-10-09 14:23:45 - alpine_bits_python.api - INFO - Logging configured at INFO level
2025-10-09 14:23:46 - alpine_bits_python.api - INFO - Database tables checked/created at startup.
```
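For reference, this output corresponds to the format string used by the PR's `logging_config.py` (shown later in this diff). A small standalone sketch that reproduces one such line might look like this (logger name and message are illustrative):

```python
import logging
import sys

# Same format string and date format as logging_config.py
formatter = logging.Formatter(
    fmt="%(asctime)s - %(name)s - %(levelname)s - %(message)s",
    datefmt="%Y-%m-%d %H:%M:%S",
)
handler = logging.StreamHandler(sys.stdout)
handler.setFormatter(formatter)

demo = logging.getLogger("alpine_bits_python.api")
demo.addHandler(handler)
demo.setLevel(logging.INFO)
demo.info("Application startup initiated")
# e.g. 2025-10-09 14:23:45 - alpine_bits_python.api - INFO - Application startup initiated
```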
## Best Practices
1. **Use lazy %-formatting**: Pass variables as logger arguments instead of building f-strings
```python
# Good
_LOGGER.info("Processing reservation %s for hotel %s", reservation_id, hotel_code)
# Avoid (performance overhead, linting warnings)
_LOGGER.info(f"Processing reservation {reservation_id} for hotel {hotel_code}")
```
2. **Use appropriate log levels**:
- `DEBUG`: Detailed tracing for development
- `INFO`: Normal application flow events
- `WARNING`: Unexpected but handled situations
- `ERROR`: Errors that need attention
- `CRITICAL`: Severe errors requiring immediate action
3. **Use `exception()` for error handling**:
```python
try:
    risky_operation()
except Exception:
    _LOGGER.exception("Operation failed")  # Automatically includes stack trace
```
4. **Don't log sensitive data**: Avoid logging passwords, tokens, or personal data
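As a small illustration of this last point (the function and values below are hypothetical, not part of the codebase), log a non-sensitive property of a secret rather than the secret itself:

```python
from alpine_bits_python.logging_config import get_logger

_LOGGER = get_logger(__name__)


def handle_webhook(token: str, payload: dict) -> None:
    # OK: record that an authenticated request arrived, plus a harmless attribute
    _LOGGER.info("Webhook authenticated (token length=%d)", len(token))
    # Avoid: this would write the credential into console and file logs
    # _LOGGER.debug("Webhook token: %s", token)
```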
## Examples
### Console-only logging (development)
```yaml
logger:
level: "DEBUG"
```
### File logging (production)
```yaml
logger:
level: "INFO"
file: "/var/log/alpinebits/app.log"
```
### Minimal logging
```yaml
logger:
level: "WARNING"
file: "logs/warnings.log"
```
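As a rough end-to-end sketch (the config path and the YAML loading step are assumptions for illustration, not part of this PR), the documented `logger` section reaches the logging setup like this:

```python
from pathlib import Path

import yaml  # assumed: PyYAML is available for reading config.yaml

from alpine_bits_python.logging_config import get_logger, setup_logging

# Assumed location; the server normally resolves this via ALPINEBITS_CONFIG_DIR
config = yaml.safe_load(Path("config/config.yaml").read_text(encoding="utf-8")) or {}

setup_logging(config)  # attaches console (and optional file) handlers to the root logger

_LOGGER = get_logger(__name__)
_LOGGER.info("Logger configured from %s", "config/config.yaml")
```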

alpinebits.log (new file, 12 lines)
View File

@@ -0,0 +1,12 @@
2025-10-09 14:28:51 - root - INFO - Logging to file: alpinebits.log
2025-10-09 14:28:51 - root - INFO - Logging configured at INFO level
2025-10-09 14:28:51 - alpine_bits_python.api - INFO - Application startup initiated
2025-10-09 14:28:51 - alpine_bits_python.alpinebits_server - INFO - Initializing action instance for AlpineBitsActionName.OTA_HOTEL_NOTIF_REPORT
2025-10-09 14:28:51 - alpine_bits_python.alpinebits_server - INFO - Initializing action instance for AlpineBitsActionName.OTA_PING
2025-10-09 14:28:51 - alpine_bits_python.alpinebits_server - INFO - Initializing action instance for AlpineBitsActionName.OTA_HOTEL_RES_NOTIF_GUEST_REQUESTS
2025-10-09 14:28:51 - alpine_bits_python.alpinebits_server - INFO - Initializing action instance for AlpineBitsActionName.OTA_READ
2025-10-09 14:28:51 - alpine_bits_python.api - INFO - Hotel 39054_001 has no push_endpoint configured
2025-10-09 14:28:51 - alpine_bits_python.api - INFO - Hotel 135 has no push_endpoint configured
2025-10-09 14:28:51 - alpine_bits_python.api - INFO - Hotel 39052_001 has no push_endpoint configured
2025-10-09 14:28:51 - alpine_bits_python.api - INFO - Hotel 39040_001 has no push_endpoint configured
2025-10-09 14:28:51 - alpine_bits_python.api - INFO - Database tables checked/created at startup.

View File

@@ -1,4 +1,3 @@
import logging
import traceback
from dataclasses import dataclass
from datetime import UTC
@@ -6,6 +5,7 @@ from enum import Enum
from typing import Any
from alpine_bits_python.db import Customer, Reservation
from alpine_bits_python.logging_config import get_logger
from alpine_bits_python.schemas import (
    CommentData,
    CommentListItemData,
@@ -25,8 +25,7 @@ from .generated.alpinebits import (
    UniqueIdType2,
)
_LOGGER = logging.getLogger(__name__)
_LOGGER.setLevel(logging.INFO)
_LOGGER = get_logger(__name__)
# Define type aliases for the two Customer types
NotifCustomer = OtaHotelResNotifRq.HotelReservations.HotelReservation.ResGuests.ResGuest.Profiles.ProfileInfo.Profile.Customer  # noqa: E501

View File

@@ -7,7 +7,6 @@ handshaking functionality with configurable supported actions and capabilities.
import inspect
import json
import logging
import re
from abc import ABC
from dataclasses import dataclass
@@ -24,6 +23,7 @@ from alpine_bits_python.alpine_bits_helpers import (
    create_res_notif_push_message,
    create_res_retrieve_response,
)
from alpine_bits_python.logging_config import get_logger
from .db import AckedRequest, Customer, Reservation
from .generated.alpinebits import (
@@ -36,8 +36,7 @@ from .generated.alpinebits import (
)
# Configure logging
logging.basicConfig(level=logging.INFO)
_LOGGER = logging.getLogger(__name__)
_LOGGER = get_logger(__name__)
class HttpStatusCode(IntEnum):

View File

@@ -1,7 +1,6 @@
import asyncio
import gzip
import json
import logging
import os
import urllib.parse
from collections import defaultdict
@@ -31,6 +30,7 @@ from .config_loader import load_config
from .db import Base, get_database_url
from .db import Customer as DBCustomer
from .db import Reservation as DBReservation
from .logging_config import get_logger, setup_logging
from .rate_limit import (
    BURST_RATE_LIMIT,
    DEFAULT_RATE_LIMIT,
@@ -40,9 +40,8 @@ from .rate_limit import (
    webhook_limiter,
)
# Configure logging
logging.basicConfig(level=logging.INFO)
_LOGGER = logging.getLogger(__name__)
# Configure logging - will be reconfigured during lifespan with actual config
_LOGGER = get_logger(__name__)
# HTTP Basic auth for AlpineBits
security_basic = HTTPBasic()
@@ -168,6 +167,10 @@ async def lifespan(app: FastAPI):
_LOGGER.exception("Failed to load config: ") _LOGGER.exception("Failed to load config: ")
config = {} config = {}
# Setup logging from config
setup_logging(config)
_LOGGER.info("Application startup initiated")
DATABASE_URL = get_database_url(config) DATABASE_URL = get_database_url(config)
engine = create_async_engine(DATABASE_URL, echo=False) engine = create_async_engine(DATABASE_URL, echo=False)
AsyncSessionLocal = async_sessionmaker(engine, expire_on_commit=False) AsyncSessionLocal = async_sessionmaker(engine, expire_on_commit=False)
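To make the startup ordering explicit, here is a minimal, self-contained sketch of the pattern this diff introduces (module names match the PR; the FastAPI wiring around it is simplified): module-level loggers are created at import time, and handlers are only attached once `setup_logging(config)` runs inside the lifespan.

```python
from contextlib import asynccontextmanager

from fastapi import FastAPI

from alpine_bits_python.logging_config import get_logger, setup_logging

# Created at import time; it has no handlers of its own and simply
# propagates to the root logger configured later during startup.
_LOGGER = get_logger(__name__)


@asynccontextmanager
async def lifespan(app: FastAPI):
    # Stand-in for load_config(); the real app reads config/config.yaml
    config = {"logger": {"level": "INFO", "file": "logs/alpinebits.log"}}
    setup_logging(config)  # attach console/file handlers from config
    _LOGGER.info("Application startup initiated")
    yield


app = FastAPI(lifespan=lifespan)
```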

View File

@@ -1,6 +1,5 @@
import hashlib
import hmac
import logging
import os
import secrets
@@ -10,8 +9,9 @@ from fastapi.security import HTTPAuthorizationCredentials, HTTPBearer
# Load environment variables from .env file
load_dotenv()
from alpine_bits_python.logging_config import get_logger
logger = logging.getLogger(__name__)
logger = get_logger(__name__)
# Security scheme
security = HTTPBearer()

View File

@@ -0,0 +1,87 @@
"""Centralized logging configuration for AlpineBits application.
This module sets up logging based on config and provides a function to get
loggers from anywhere in the application.
"""
import logging
import sys
from pathlib import Path
def setup_logging(config: dict | None = None):
    """Configure logging based on application config.

    Args:
        config: Application configuration dict with optional 'logger' section

    Logger config format:
        logger:
          level: "INFO"  # DEBUG, INFO, WARNING, ERROR, CRITICAL
          file: "alpinebits.log"  # Optional, logs to console if not provided
    """
    if config is None:
        config = {}

    logger_config = config.get("logger", {})
    level = logger_config.get("level", "INFO").upper()
    log_file = logger_config.get("file")

    # Convert string level to logging constant
    numeric_level = getattr(logging, level, logging.INFO)

    # Create formatter with timestamp
    formatter = logging.Formatter(
        fmt="%(asctime)s - %(name)s - %(levelname)s - %(message)s",
        datefmt="%Y-%m-%d %H:%M:%S",
    )

    # Get root logger
    root_logger = logging.getLogger()
    root_logger.setLevel(numeric_level)

    # Remove existing handlers to avoid duplicates
    root_logger.handlers.clear()

    # Console handler (always add this)
    console_handler = logging.StreamHandler(sys.stdout)
    console_handler.setLevel(numeric_level)
    console_handler.setFormatter(formatter)
    root_logger.addHandler(console_handler)

    # File handler (optional)
    if log_file:
        log_path = Path(log_file)
        # Create logs directory if it doesn't exist
        if log_path.parent != Path():
            log_path.parent.mkdir(parents=True, exist_ok=True)

        file_handler = logging.FileHandler(log_file, encoding="utf-8")
        file_handler.setLevel(numeric_level)
        file_handler.setFormatter(formatter)
        root_logger.addHandler(file_handler)
        root_logger.info("Logging to file: %s", log_file)

    root_logger.info("Logging configured at %s level", level)


def get_logger(name: str) -> logging.Logger:
    """Get a logger instance for the given module name.

    Usage:
        from alpine_bits_python.logging_config import get_logger

        _LOGGER = get_logger(__name__)
        _LOGGER.info("Something happened")

    Args:
        name: Usually __name__ from the calling module

    Returns:
        Configured logger instance
    """
    return logging.getLogger(name)
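A quick usage sketch for this module (config values are illustrative): calling `setup_logging` again with a new config replaces the previous handlers, because the root logger's handler list is cleared on each call.

```python
from alpine_bits_python.logging_config import get_logger, setup_logging

# Console-only logging at DEBUG level
setup_logging({"logger": {"level": "DEBUG"}})
log = get_logger("alpine_bits_python.example")  # illustrative logger name
log.debug("visible at DEBUG level")

# Reconfigure: INFO level plus a file handler; old handlers are cleared first
setup_logging({"logger": {"level": "INFO", "file": "logs/alpinebits.log"}})
log.debug("no longer emitted")  # below the new threshold
log.info("written to console and logs/alpinebits.log")
```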