Updated data collection system

Jonas Linter
2025-10-21 11:55:14 +02:00
parent 0d754846ce
commit 6ba8a0dba2
9 changed files with 1418 additions and 57 deletions

.env.example

@@ -13,3 +13,10 @@ META_AD_ACCOUNT_ID=act_your_account_id_here
# Run: uv run python src/meta_api_grabber/auth.py
# Option 2: Manually provide a token from https://developers.facebook.com/tools/explorer/
META_ACCESS_TOKEN=your_access_token_here
# Database Configuration (TimescaleDB)
# For local development using docker-compose, use these defaults:
DATABASE_URL=postgresql://meta_user:meta_password@localhost:5432/meta_insights
# For production, update with your actual database credentials:
# DATABASE_URL=postgresql://username:password@host:port/database
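
To sanity-check this configuration, a quick connectivity probe with asyncpg (the driver this project already depends on) can help; `check_connection` below is an illustrative helper, not part of the repo:

```python
# Sketch: verify DATABASE_URL reaches the database and TimescaleDB is installed.
import asyncio
import os

import asyncpg
from dotenv import load_dotenv


async def check_connection() -> None:
    load_dotenv()  # picks up DATABASE_URL from .env
    conn = await asyncpg.connect(os.environ["DATABASE_URL"])
    try:
        version = await conn.fetchval(
            "SELECT extversion FROM pg_extension WHERE extname = 'timescaledb'"
        )
        print(f"Connected; TimescaleDB extension version: {version}")
    finally:
        await conn.close()


asyncio.run(check_connection())
```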

README.md

@@ -1,70 +1,94 @@
# Meta API Grabber
Async data collection system for Meta's Marketing API with TimescaleDB time-series storage and dashboard support.
## Features
- **OAuth2 Authentication** - Automated token generation flow
- **TimescaleDB Integration** - Optimized time-series database for ad metrics
- **Scheduled Collection** - Periodic data grabbing (every 2 hours recommended)
- **Metadata Caching** - Smart caching of accounts, campaigns, and ad sets
- **Async/await architecture** for efficient API calls
- **Conservative rate limiting** (2s between requests, 1 concurrent request)
- **Multi-level insights** - Account, campaign, and ad set data
- **JSON output** with timestamps
- **Configurable date ranges**
- **Dashboard Ready** - Includes Grafana setup for visualization
- **Continuous Aggregates** - Pre-computed hourly/daily rollups
- **Data Compression** - Automatic compression of older data
## Quick Start
### 1. Install Dependencies
```bash
uv sync
```
### 2. Start TimescaleDB
```bash
docker-compose up -d
```
This starts:
- **TimescaleDB** on port 5432 (PostgreSQL-compatible)
- **Grafana** on port 3000 (for dashboards)
### 3. Configure Credentials
```bash
cp .env.example .env
```
Edit `.env` and add:
- **META_APP_ID** and **META_APP_SECRET** from [Meta for Developers](https://developers.facebook.com/)
- **META_AD_ACCOUNT_ID** from Meta Ads Manager (format: `act_1234567890`)
- **DATABASE_URL** is pre-configured for local Docker setup
### 4. Get Access Token
**Option A: OAuth2 Flow (Recommended)**
```bash
uv run python src/meta_api_grabber/auth.py
```
Follow the prompts to authorize and save your token.
**Option B: Manual Token**
- Get a token from [Graph API Explorer](https://developers.facebook.com/tools/explorer/)
- Add it to `.env` as `META_ACCESS_TOKEN`
### 5. Start Scheduled Collection
```bash
uv run python src/meta_api_grabber/scheduled_grabber.py
```
This will:
- Collect data every 2 hours using the `today` date preset (recommended by Meta)
- Cache metadata (accounts, campaigns, ad sets) once daily
- Store time-series data in TimescaleDB
- Use upsert strategy to handle updates
## Usage Modes
### 1. Scheduled Collection (Recommended for Dashboards)
```bash
uv run python src/meta_api_grabber/scheduled_grabber.py
```
- Runs continuously, collecting data every 2 hours
- Stores data in TimescaleDB for dashboard visualization
- Uses `today` date preset (recommended by Meta)
- Caches metadata to reduce API calls
### 2. One-Time Data Export (JSON)
```bash
uv run python src/meta_api_grabber/insights_grabber.py
```
- Fetches insights for the last 7 days
- Saves to `data/meta_insights_TIMESTAMP.json`
- Good for ad-hoc analysis or testing
### 3. OAuth2 Authentication
```bash
uv run python src/meta_api_grabber/auth.py
```
- Interactive flow to get access token
- Saves token to `.env` automatically
## Data Collected
@@ -84,23 +108,68 @@ uv run python src/meta_api_grabber/insights_grabber.py
- Impressions, clicks, spend
- CTR, CPM
## Database Schema
### Time-Series Tables (Hypertables)
- **account_insights** - Account-level metrics over time
- **campaign_insights** - Campaign-level metrics over time
- **adset_insights** - Ad set level metrics over time
### Metadata Tables (Cached)
- **ad_accounts** - Account metadata
- **campaigns** - Campaign metadata
- **adsets** - Ad set metadata
### Continuous Aggregates
- **account_insights_hourly** - Hourly rollups
- **account_insights_daily** - Daily rollups
### Features
- **Automatic partitioning** by day (chunk_time_interval = 1 day)
- **Compression** for data older than 7 days
- **Indexes** on account_id, campaign_id, adset_id + time
- **Upsert strategy** to handle duplicate/updated data
## Dashboard Setup
### Access Grafana
1. Open http://localhost:3000
2. Login with `admin` / `admin`
3. Add TimescaleDB as data source:
- Type: PostgreSQL
- Host: `timescaledb:5432`
- Database: `meta_insights`
- User: `meta_user`
- Password: `meta_password`
- TLS/SSL Mode: disable
### Example Queries
**Latest Account Metrics:**
```sql
SELECT * FROM latest_account_metrics WHERE account_id = 'act_your_id';
```
**Campaign Performance (Last 24h):**
```sql
SELECT * FROM campaign_performance_24h ORDER BY total_spend DESC;
```
**Hourly Trend:**
```sql
SELECT bucket, avg_impressions, avg_clicks, avg_spend
FROM account_insights_hourly
WHERE account_id = 'act_your_id'
AND bucket >= NOW() - INTERVAL '7 days'
ORDER BY bucket;
```
## Rate Limiting
The system is configured to be very conservative:
- **2 seconds delay** between API requests
- **Only 1 concurrent request** at a time
- **Top 50 campaigns/adsets** per collection
- **2 hour intervals** between collections
This ensures you stay well within Meta's API rate limits.
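For reference, the throttling described above boils down to a fixed delay plus a single-slot semaphore; a minimal sketch (illustrative names only; the real logic lives in the grabber classes):

```python
# Sketch of the conservative throttling described above (illustrative only).
import asyncio

REQUEST_DELAY = 2.0                # seconds between API requests
_semaphore = asyncio.Semaphore(1)  # at most one in-flight request


async def rate_limited(call, *args, **kwargs):
    """Run a blocking SDK call in a worker thread, one at a time, after a delay."""
    async with _semaphore:
        await asyncio.sleep(REQUEST_DELAY)
        return await asyncio.to_thread(call, *args, **kwargs)
```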

docker-compose.yml

@@ -0,0 +1,42 @@
version: '3.8'
services:
timescaledb:
image: timescale/timescaledb:latest-pg16
container_name: meta_timescaledb
ports:
- "5432:5432"
environment:
POSTGRES_DB: meta_insights
POSTGRES_USER: meta_user
POSTGRES_PASSWORD: meta_password
volumes:
- timescale_data:/var/lib/postgresql/data
- ./src/meta_api_grabber/db_schema.sql:/docker-entrypoint-initdb.d/01-schema.sql
healthcheck:
test: ["CMD-SHELL", "pg_isready -U meta_user -d meta_insights"]
interval: 10s
timeout: 5s
retries: 5
restart: unless-stopped
# Optional: Grafana for visualization
grafana:
image: grafana/grafana:latest
container_name: meta_grafana
ports:
- "3000:3000"
environment:
GF_SECURITY_ADMIN_USER: admin
GF_SECURITY_ADMIN_PASSWORD: admin
GF_INSTALL_PLUGINS: grafana-clock-panel
volumes:
- grafana_data:/var/lib/grafana
depends_on:
timescaledb:
condition: service_healthy
restart: unless-stopped
volumes:
timescale_data:
grafana_data:

pyproject.toml

@@ -6,7 +6,10 @@ readme = "README.md"
requires-python = ">=3.13"
dependencies = [
"aiohttp>=3.13.1",
"alembic>=1.17.0",
"asyncpg>=0.30.0",
"facebook-business>=23.0.3",
"python-dotenv>=1.1.1",
"requests-oauthlib>=2.0.0",
"sqlalchemy[asyncio]>=2.0.44",
]

src/meta_api_grabber/database.py

@@ -0,0 +1,376 @@
"""
Database module for storing Meta ad insights in TimescaleDB.
Handles time-series data with metadata caching.
"""
import json
import os
from datetime import datetime
from typing import Any, Dict, List, Optional
import asyncpg
from dotenv import load_dotenv
class TimescaleDBClient:
"""Async client for TimescaleDB operations with metadata caching."""
def __init__(self, connection_string: Optional[str] = None):
"""
Initialize the database client.
Args:
connection_string: PostgreSQL connection string.
If not provided, will load from DATABASE_URL env var.
"""
load_dotenv()
self.connection_string = connection_string or os.getenv("DATABASE_URL")
if not self.connection_string:
raise ValueError(
"Database connection string required. "
"Set DATABASE_URL in .env or pass connection_string parameter."
)
self.pool: Optional[asyncpg.Pool] = None
async def connect(self):
"""Create connection pool."""
if not self.pool:
self.pool = await asyncpg.create_pool(
self.connection_string,
min_size=2,
max_size=10,
command_timeout=60,
)
print("Connected to TimescaleDB")
async def close(self):
"""Close connection pool."""
if self.pool:
await self.pool.close()
self.pool = None
print("Disconnected from TimescaleDB")
async def __aenter__(self):
"""Context manager entry."""
await self.connect()
return self
async def __aexit__(self, exc_type, exc_val, exc_tb):
"""Context manager exit."""
await self.close()
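# Usage sketch (illustrative, not part of the module): with DATABASE_URL set
# and db_schema.sql applied, the client works as an async context manager:
#
#   async with TimescaleDBClient() as db:
#       await db.upsert_ad_account("act_1234567890", account_name="Demo")
#       latest = await db.get_latest_account_metrics("act_1234567890")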
# ========================================================================
# METADATA CACHING (Ad Accounts, Campaigns, Ad Sets)
# ========================================================================
async def upsert_ad_account(
self,
account_id: str,
account_name: Optional[str] = None,
currency: Optional[str] = None,
timezone_name: Optional[str] = None,
):
"""
Insert or update ad account metadata.
Args:
account_id: Meta ad account ID
account_name: Account name
currency: Account currency
timezone_name: Account timezone
"""
query = """
INSERT INTO ad_accounts (account_id, account_name, currency, timezone_name, updated_at)
VALUES ($1, $2, $3, $4, NOW())
ON CONFLICT (account_id)
DO UPDATE SET
account_name = COALESCE(EXCLUDED.account_name, ad_accounts.account_name),
currency = COALESCE(EXCLUDED.currency, ad_accounts.currency),
timezone_name = COALESCE(EXCLUDED.timezone_name, ad_accounts.timezone_name),
updated_at = NOW()
"""
async with self.pool.acquire() as conn:
await conn.execute(query, account_id, account_name, currency, timezone_name)
async def upsert_campaign(
self,
campaign_id: str,
account_id: str,
campaign_name: str,
status: Optional[str] = None,
objective: Optional[str] = None,
):
"""
Insert or update campaign metadata.
Args:
campaign_id: Meta campaign ID
account_id: Parent ad account ID
campaign_name: Campaign name
status: Campaign status
objective: Campaign objective
"""
query = """
INSERT INTO campaigns (campaign_id, account_id, campaign_name, status, objective, updated_at)
VALUES ($1, $2, $3, $4, $5, NOW())
ON CONFLICT (campaign_id)
DO UPDATE SET
campaign_name = EXCLUDED.campaign_name,
status = COALESCE(EXCLUDED.status, campaigns.status),
objective = COALESCE(EXCLUDED.objective, campaigns.objective),
updated_at = NOW()
"""
async with self.pool.acquire() as conn:
await conn.execute(query, campaign_id, account_id, campaign_name, status, objective)
async def upsert_adset(
self,
adset_id: str,
campaign_id: str,
adset_name: str,
status: Optional[str] = None,
):
"""
Insert or update ad set metadata.
Args:
adset_id: Meta ad set ID
campaign_id: Parent campaign ID
adset_name: Ad set name
status: Ad set status
"""
query = """
INSERT INTO adsets (adset_id, campaign_id, adset_name, status, updated_at)
VALUES ($1, $2, $3, $4, NOW())
ON CONFLICT (adset_id)
DO UPDATE SET
adset_name = EXCLUDED.adset_name,
status = COALESCE(EXCLUDED.status, adsets.status),
updated_at = NOW()
"""
async with self.pool.acquire() as conn:
await conn.execute(query, adset_id, campaign_id, adset_name, status)
# ========================================================================
# TIME-SERIES DATA INSERTION
# ========================================================================
async def insert_account_insights(
self,
time: datetime,
account_id: str,
data: Dict[str, Any],
date_preset: str = "today",
):
"""
Insert account-level insights data.
Args:
time: Timestamp for the data point
account_id: Ad account ID
data: Insights data dictionary from Meta API
date_preset: Date preset used (e.g., 'today', 'yesterday')
"""
query = """
INSERT INTO account_insights (
time, account_id, impressions, clicks, spend, reach, frequency,
ctr, cpc, cpm, cpp, actions, cost_per_action_type, date_preset, fetched_at
) VALUES ($1, $2, $3, $4, $5, $6, $7, $8, $9, $10, $11, $12, $13, $14, NOW())
ON CONFLICT (time, account_id)
DO UPDATE SET
impressions = EXCLUDED.impressions,
clicks = EXCLUDED.clicks,
spend = EXCLUDED.spend,
reach = EXCLUDED.reach,
frequency = EXCLUDED.frequency,
ctr = EXCLUDED.ctr,
cpc = EXCLUDED.cpc,
cpm = EXCLUDED.cpm,
cpp = EXCLUDED.cpp,
actions = EXCLUDED.actions,
cost_per_action_type = EXCLUDED.cost_per_action_type,
fetched_at = NOW()
"""
# Parse metrics (the API returns them as strings); missing fields are stored as NULL
impressions = int(data.get("impressions", 0)) if data.get("impressions") else None
clicks = int(data.get("clicks", 0)) if data.get("clicks") else None
spend = float(data.get("spend", 0)) if data.get("spend") else None
reach = int(data.get("reach", 0)) if data.get("reach") else None
frequency = float(data.get("frequency", 0)) if data.get("frequency") else None
ctr = float(data.get("ctr", 0)) if data.get("ctr") else None
cpc = float(data.get("cpc", 0)) if data.get("cpc") else None
cpm = float(data.get("cpm", 0)) if data.get("cpm") else None
cpp = float(data.get("cpp", 0)) if data.get("cpp") else None
# Store actions as JSONB
actions = json.dumps(data.get("actions", [])) if data.get("actions") else None
cost_per_action = json.dumps(data.get("cost_per_action_type", [])) if data.get("cost_per_action_type") else None
async with self.pool.acquire() as conn:
await conn.execute(
query,
time, account_id, impressions, clicks, spend, reach, frequency,
ctr, cpc, cpm, cpp, actions, cost_per_action, date_preset
)
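# Note: Insights API payloads carry metric values as strings, which is why the
# fields above are parsed with int()/float(). Illustrative example payload:
#   {"impressions": "1204", "clicks": "37", "spend": "12.48", "ctr": "3.07"}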
async def insert_campaign_insights(
self,
time: datetime,
campaign_id: str,
account_id: str,
data: Dict[str, Any],
date_preset: str = "today",
):
"""
Insert campaign-level insights data.
Args:
time: Timestamp for the data point
campaign_id: Campaign ID
account_id: Ad account ID
data: Insights data dictionary from Meta API
date_preset: Date preset used
"""
query = """
INSERT INTO campaign_insights (
time, campaign_id, account_id, impressions, clicks, spend, reach,
ctr, cpc, cpm, actions, date_preset, fetched_at
) VALUES ($1, $2, $3, $4, $5, $6, $7, $8, $9, $10, $11, $12, NOW())
ON CONFLICT (time, campaign_id)
DO UPDATE SET
impressions = EXCLUDED.impressions,
clicks = EXCLUDED.clicks,
spend = EXCLUDED.spend,
reach = EXCLUDED.reach,
ctr = EXCLUDED.ctr,
cpc = EXCLUDED.cpc,
cpm = EXCLUDED.cpm,
actions = EXCLUDED.actions,
fetched_at = NOW()
"""
impressions = int(data.get("impressions", 0)) if data.get("impressions") else None
clicks = int(data.get("clicks", 0)) if data.get("clicks") else None
spend = float(data.get("spend", 0)) if data.get("spend") else None
reach = int(data.get("reach", 0)) if data.get("reach") else None
ctr = float(data.get("ctr", 0)) if data.get("ctr") else None
cpc = float(data.get("cpc", 0)) if data.get("cpc") else None
cpm = float(data.get("cpm", 0)) if data.get("cpm") else None
actions = json.dumps(data.get("actions", [])) if data.get("actions") else None
async with self.pool.acquire() as conn:
await conn.execute(
query,
time, campaign_id, account_id, impressions, clicks, spend, reach,
ctr, cpc, cpm, actions, date_preset
)
async def insert_adset_insights(
self,
time: datetime,
adset_id: str,
campaign_id: str,
account_id: str,
data: Dict[str, Any],
date_preset: str = "today",
):
"""
Insert ad set level insights data.
Args:
time: Timestamp for the data point
adset_id: Ad set ID
campaign_id: Campaign ID
account_id: Ad account ID
data: Insights data dictionary from Meta API
date_preset: Date preset used
"""
query = """
INSERT INTO adset_insights (
time, adset_id, campaign_id, account_id, impressions, clicks, spend, reach,
ctr, cpc, cpm, actions, date_preset, fetched_at
) VALUES ($1, $2, $3, $4, $5, $6, $7, $8, $9, $10, $11, $12, $13, NOW())
ON CONFLICT (time, adset_id)
DO UPDATE SET
impressions = EXCLUDED.impressions,
clicks = EXCLUDED.clicks,
spend = EXCLUDED.spend,
reach = EXCLUDED.reach,
ctr = EXCLUDED.ctr,
cpc = EXCLUDED.cpc,
cpm = EXCLUDED.cpm,
actions = EXCLUDED.actions,
fetched_at = NOW()
"""
impressions = int(data.get("impressions", 0)) if data.get("impressions") else None
clicks = int(data.get("clicks", 0)) if data.get("clicks") else None
spend = float(data.get("spend", 0)) if data.get("spend") else None
reach = int(data.get("reach", 0)) if data.get("reach") else None
ctr = float(data.get("ctr", 0)) if data.get("ctr") else None
cpc = float(data.get("cpc", 0)) if data.get("cpc") else None
cpm = float(data.get("cpm", 0)) if data.get("cpm") else None
actions = json.dumps(data.get("actions", [])) if data.get("actions") else None
async with self.pool.acquire() as conn:
await conn.execute(
query,
time, adset_id, campaign_id, account_id, impressions, clicks, spend, reach,
ctr, cpc, cpm, actions, date_preset
)
# ========================================================================
# QUERY HELPERS
# ========================================================================
async def get_latest_account_metrics(self, account_id: str) -> Optional[Dict[str, Any]]:
"""
Get the most recent metrics for an account.
Args:
account_id: Ad account ID
Returns:
Dictionary of latest metrics or None
"""
query = """
SELECT * FROM latest_account_metrics
WHERE account_id = $1
"""
async with self.pool.acquire() as conn:
row = await conn.fetchrow(query, account_id)
return dict(row) if row else None
async def get_account_metrics_range(
self,
account_id: str,
start_time: datetime,
end_time: datetime,
) -> List[Dict[str, Any]]:
"""
Get account metrics for a time range.
Args:
account_id: Ad account ID
start_time: Start of time range
end_time: End of time range
Returns:
List of metric dictionaries
"""
query = """
SELECT * FROM account_insights
WHERE account_id = $1 AND time BETWEEN $2 AND $3
ORDER BY time DESC
"""
async with self.pool.acquire() as conn:
rows = await conn.fetch(query, account_id, start_time, end_time)
return [dict(row) for row in rows]
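# Example (illustrative): fetch the last 24 hours of data points for charting:
#   end = datetime.now(timezone.utc)
#   rows = await db.get_account_metrics_range(account_id, end - timedelta(days=1), end)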

src/meta_api_grabber/db_schema.sql

@@ -0,0 +1,281 @@
-- TimescaleDB Schema for Meta Ad Insights
-- This schema is optimized for time-series data collection and dashboard queries
-- Enable TimescaleDB extension (run as superuser)
-- CREATE EXTENSION IF NOT EXISTS timescaledb;
-- ============================================================================
-- METADATA TABLES (Regular PostgreSQL tables for caching)
-- ============================================================================
-- Ad Accounts (rarely changes, cached)
CREATE TABLE IF NOT EXISTS ad_accounts (
account_id VARCHAR(50) PRIMARY KEY,
account_name VARCHAR(255),
currency VARCHAR(10),
timezone_name VARCHAR(100),
created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
updated_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
);
-- Campaigns (metadata cache)
CREATE TABLE IF NOT EXISTS campaigns (
campaign_id VARCHAR(50) PRIMARY KEY,
account_id VARCHAR(50) REFERENCES ad_accounts(account_id),
campaign_name VARCHAR(255) NOT NULL,
status VARCHAR(50),
objective VARCHAR(100),
created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
updated_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
);
CREATE INDEX IF NOT EXISTS idx_campaigns_account ON campaigns(account_id);
-- Ad Sets (metadata cache)
CREATE TABLE IF NOT EXISTS adsets (
adset_id VARCHAR(50) PRIMARY KEY,
campaign_id VARCHAR(50) REFERENCES campaigns(campaign_id),
adset_name VARCHAR(255) NOT NULL,
status VARCHAR(50),
created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
updated_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
);
CREATE INDEX IF NOT EXISTS idx_adsets_campaign ON adsets(campaign_id);
-- ============================================================================
-- TIME-SERIES TABLES (Hypertables)
-- ============================================================================
-- Account-level insights (time-series data)
CREATE TABLE IF NOT EXISTS account_insights (
time TIMESTAMPTZ NOT NULL,
account_id VARCHAR(50) NOT NULL REFERENCES ad_accounts(account_id),
-- Core metrics
impressions BIGINT,
clicks BIGINT,
spend NUMERIC(12, 2),
reach BIGINT,
frequency NUMERIC(10, 4),
-- Calculated metrics
ctr NUMERIC(10, 6), -- Click-through rate
cpc NUMERIC(10, 4), -- Cost per click
cpm NUMERIC(10, 4), -- Cost per mille (thousand impressions)
cpp NUMERIC(10, 4), -- Cost per 1,000 people reached
-- Actions (stored as JSONB for flexibility)
actions JSONB,
cost_per_action_type JSONB,
-- Metadata
date_preset VARCHAR(50),
fetched_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
-- Composite primary key
PRIMARY KEY (time, account_id)
);
-- Convert to hypertable (partition by time)
SELECT create_hypertable('account_insights', 'time',
if_not_exists => TRUE,
chunk_time_interval => INTERVAL '1 day'
);
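-- For reference, the chunks backing this hypertable can be listed with:
--   SELECT show_chunks('account_insights');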
-- Create index for efficient queries
CREATE INDEX IF NOT EXISTS idx_account_insights_account_time
ON account_insights (account_id, time DESC);
-- Campaign-level insights (time-series data)
CREATE TABLE IF NOT EXISTS campaign_insights (
time TIMESTAMPTZ NOT NULL,
campaign_id VARCHAR(50) NOT NULL REFERENCES campaigns(campaign_id),
account_id VARCHAR(50) NOT NULL REFERENCES ad_accounts(account_id),
-- Core metrics
impressions BIGINT,
clicks BIGINT,
spend NUMERIC(12, 2),
reach BIGINT,
-- Calculated metrics
ctr NUMERIC(10, 6),
cpc NUMERIC(10, 4),
cpm NUMERIC(10, 4),
-- Actions
actions JSONB,
-- Metadata
date_preset VARCHAR(50),
fetched_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
PRIMARY KEY (time, campaign_id)
);
-- Convert to hypertable
SELECT create_hypertable('campaign_insights', 'time',
if_not_exists => TRUE,
chunk_time_interval => INTERVAL '1 day'
);
CREATE INDEX IF NOT EXISTS idx_campaign_insights_campaign_time
ON campaign_insights (campaign_id, time DESC);
CREATE INDEX IF NOT EXISTS idx_campaign_insights_account_time
ON campaign_insights (account_id, time DESC);
-- Ad Set level insights (time-series data)
CREATE TABLE IF NOT EXISTS adset_insights (
time TIMESTAMPTZ NOT NULL,
adset_id VARCHAR(50) NOT NULL REFERENCES adsets(adset_id),
campaign_id VARCHAR(50) NOT NULL REFERENCES campaigns(campaign_id),
account_id VARCHAR(50) NOT NULL REFERENCES ad_accounts(account_id),
-- Core metrics
impressions BIGINT,
clicks BIGINT,
spend NUMERIC(12, 2),
reach BIGINT,
-- Calculated metrics
ctr NUMERIC(10, 6),
cpc NUMERIC(10, 4),
cpm NUMERIC(10, 4),
-- Actions
actions JSONB,
-- Metadata
date_preset VARCHAR(50),
fetched_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
PRIMARY KEY (time, adset_id)
);
-- Convert to hypertable
SELECT create_hypertable('adset_insights', 'time',
if_not_exists => TRUE,
chunk_time_interval => INTERVAL '1 day'
);
CREATE INDEX IF NOT EXISTS idx_adset_insights_adset_time
ON adset_insights (adset_id, time DESC);
CREATE INDEX IF NOT EXISTS idx_adset_insights_campaign_time
ON adset_insights (campaign_id, time DESC);
CREATE INDEX IF NOT EXISTS idx_adset_insights_account_time
ON adset_insights (account_id, time DESC);
-- ============================================================================
-- CONTINUOUS AGGREGATES (Pre-computed rollups for dashboards)
-- ============================================================================
-- Hourly aggregates for account insights
CREATE MATERIALIZED VIEW IF NOT EXISTS account_insights_hourly
WITH (timescaledb.continuous) AS
SELECT
time_bucket('1 hour', time) AS bucket,
account_id,
AVG(impressions) as avg_impressions,
AVG(clicks) as avg_clicks,
AVG(spend) as avg_spend,
AVG(ctr) as avg_ctr,
AVG(cpc) as avg_cpc,
AVG(cpm) as avg_cpm,
MAX(reach) as max_reach,
COUNT(*) as sample_count
FROM account_insights
GROUP BY bucket, account_id;
-- Refresh policy: refresh last 2 days every hour
SELECT add_continuous_aggregate_policy('account_insights_hourly',
start_offset => INTERVAL '2 days',
end_offset => INTERVAL '1 hour',
schedule_interval => INTERVAL '1 hour',
if_not_exists => TRUE
);
-- Daily aggregates for account insights
CREATE MATERIALIZED VIEW IF NOT EXISTS account_insights_daily
WITH (timescaledb.continuous) AS
SELECT
time_bucket('1 day', time) AS bucket,
account_id,
AVG(impressions) as avg_impressions,
SUM(impressions) as total_impressions,
AVG(clicks) as avg_clicks,
SUM(clicks) as total_clicks,
AVG(spend) as avg_spend,
SUM(spend) as total_spend,
AVG(ctr) as avg_ctr,
AVG(cpc) as avg_cpc,
AVG(cpm) as avg_cpm,
MAX(reach) as max_reach,
COUNT(*) as sample_count
FROM account_insights
GROUP BY bucket, account_id;
-- Refresh policy: refresh last 7 days every 4 hours
SELECT add_continuous_aggregate_policy('account_insights_daily',
start_offset => INTERVAL '7 days',
end_offset => INTERVAL '1 hour',
schedule_interval => INTERVAL '4 hours',
if_not_exists => TRUE
);
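-- Aggregates can also be refreshed manually (e.g. after a backfill):
--   CALL refresh_continuous_aggregate('account_insights_hourly', NULL, now());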
-- ============================================================================
-- DATA RETENTION POLICIES (Optional - uncomment to enable)
-- ============================================================================
-- Keep raw data for 90 days, then drop
-- SELECT add_retention_policy('account_insights', INTERVAL '90 days', if_not_exists => TRUE);
-- SELECT add_retention_policy('campaign_insights', INTERVAL '90 days', if_not_exists => TRUE);
-- SELECT add_retention_policy('adset_insights', INTERVAL '90 days', if_not_exists => TRUE);
-- Compress data older than 7 days for better storage efficiency.
-- Compression must be enabled on each hypertable before a policy can be added.
ALTER TABLE account_insights SET (timescaledb.compress, timescaledb.compress_segmentby = 'account_id');
ALTER TABLE campaign_insights SET (timescaledb.compress, timescaledb.compress_segmentby = 'campaign_id');
ALTER TABLE adset_insights SET (timescaledb.compress, timescaledb.compress_segmentby = 'adset_id');
SELECT add_compression_policy('account_insights', INTERVAL '7 days', if_not_exists => TRUE);
SELECT add_compression_policy('campaign_insights', INTERVAL '7 days', if_not_exists => TRUE);
SELECT add_compression_policy('adset_insights', INTERVAL '7 days', if_not_exists => TRUE);
-- ============================================================================
-- HELPER VIEWS FOR DASHBOARDS
-- ============================================================================
-- Latest metrics per account
CREATE OR REPLACE VIEW latest_account_metrics AS
SELECT DISTINCT ON (account_id)
account_id,
time,
impressions,
clicks,
spend,
ctr,
cpc,
cpm,
reach,
frequency
FROM account_insights
ORDER BY account_id, time DESC;
-- Campaign performance summary (last 24 hours)
CREATE OR REPLACE VIEW campaign_performance_24h AS
SELECT
c.campaign_id,
c.campaign_name,
c.account_id,
SUM(ci.impressions) as total_impressions,
SUM(ci.clicks) as total_clicks,
SUM(ci.spend) as total_spend,
AVG(ci.ctr) as avg_ctr,
AVG(ci.cpc) as avg_cpc
FROM campaigns c
-- Keep the time filter in the join condition so campaigns with no recent rows still appear
LEFT JOIN campaign_insights ci
ON c.campaign_id = ci.campaign_id
AND ci.time >= NOW() - INTERVAL '24 hours'
GROUP BY c.campaign_id, c.campaign_name, c.account_id
ORDER BY total_spend DESC;

src/meta_api_grabber/insights_grabber.py

@@ -292,7 +292,7 @@ async def main():
grabber = MetaInsightsGrabber()
# Grab insights for the last 7 days
insights = await grabber.grab_all_insights(date_preset="last_7d")
insights = await grabber.grab_all_insights(date_preset=AdsInsights.DatePreset.last_7d)
# Save to JSON
filepath = await grabber.save_insights_to_json(insights)

src/meta_api_grabber/scheduled_grabber.py

@@ -0,0 +1,424 @@
"""
Scheduled data grabber that collects Meta ad insights and stores in TimescaleDB.
Runs periodically to build time-series data for dashboards.
"""
import asyncio
import os
from datetime import datetime, timedelta, timezone
from typing import Optional
from dotenv import load_dotenv
from facebook_business.adobjects.adaccount import AdAccount
from facebook_business.adobjects.adsinsights import AdsInsights
from facebook_business.api import FacebookAdsApi
from .database import TimescaleDBClient
class ScheduledInsightsGrabber:
"""
Scheduled grabber for Meta ad insights with TimescaleDB storage.
Optimized for periodic data collection:
- Uses 'today' date preset (recommended by Meta)
- Caches metadata (accounts, campaigns, adsets)
- Upserts time-series data
- Conservative rate limiting
"""
def __init__(self, access_token: Optional[str] = None):
"""
Initialize the scheduled grabber.
Args:
access_token: Optional access token. If not provided, loads from env.
"""
load_dotenv()
self.access_token = access_token or os.getenv("META_ACCESS_TOKEN")
self.app_secret = os.getenv("META_APP_SECRET")
self.app_id = os.getenv("META_APP_ID")
self.ad_account_id = os.getenv("META_AD_ACCOUNT_ID")
if not self.access_token:
raise ValueError(
"Access token required. Set META_ACCESS_TOKEN in .env or "
"run 'uv run python src/meta_api_grabber/auth.py'"
)
if not all([self.app_secret, self.app_id, self.ad_account_id]):
raise ValueError(
"Missing required environment variables (META_APP_SECRET, META_APP_ID, META_AD_ACCOUNT_ID)"
)
# Initialize Facebook Ads API
FacebookAdsApi.init(
app_id=self.app_id,
app_secret=self.app_secret,
access_token=self.access_token,
)
self.ad_account = AdAccount(self.ad_account_id)
# Database client
self.db: Optional[TimescaleDBClient] = None
# Conservative rate limiting
self.request_delay = 2.0 # 2 seconds between API requests
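# The facebook_business SDK is synchronous, so each call is handed to the
# default thread-pool executor below to keep the event loop unblocked.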
async def _rate_limited_request(self, func, *args, **kwargs):
"""Execute a request with rate limiting."""
await asyncio.sleep(self.request_delay)
loop = asyncio.get_event_loop()
return await loop.run_in_executor(None, lambda: func(*args, **kwargs))
async def cache_account_metadata(self):
"""
Cache ad account metadata to database.
This reduces API calls by storing rarely-changing account info.
"""
print("Caching account metadata...")
# Get account details
account_fields = ['name', 'currency', 'timezone_name']
account_data = await self._rate_limited_request(
self.ad_account.api_get,
fields=account_fields
)
# Store in database
await self.db.upsert_ad_account(
account_id=self.ad_account_id,
account_name=account_data.get('name'),
currency=account_data.get('currency'),
timezone_name=account_data.get('timezone_name'),
)
print(f" Account cached: {account_data.get('name')}")
async def cache_campaigns_metadata(self, limit: int = 100):
"""
Cache campaign metadata to database.
Args:
limit: Maximum number of campaigns to cache
"""
print("Caching campaign metadata...")
# Get campaigns
from facebook_business.adobjects.campaign import Campaign
campaigns = await self._rate_limited_request(
self.ad_account.get_campaigns,
fields=[
Campaign.Field.name,
Campaign.Field.status,
Campaign.Field.objective,
],
params={'limit': limit}
)
count = 0
for campaign in campaigns:
await self.db.upsert_campaign(
campaign_id=campaign['id'],
account_id=self.ad_account_id,
campaign_name=campaign.get('name', 'Unknown'),
status=campaign.get('status'),
objective=campaign.get('objective'),
)
count += 1
print(f" {count} campaigns cached")
async def cache_adsets_metadata(self, limit: int = 100):
"""
Cache ad set metadata to database.
Args:
limit: Maximum number of ad sets to cache
"""
print("Caching ad set metadata...")
# Get ad sets
from facebook_business.adobjects.adset import AdSet
adsets = await self._rate_limited_request(
self.ad_account.get_ad_sets,
fields=[
AdSet.Field.name,
AdSet.Field.campaign_id,
AdSet.Field.status,
],
params={'limit': limit}
)
count = 0
for adset in adsets:
await self.db.upsert_adset(
adset_id=adset['id'],
campaign_id=adset.get('campaign_id'),
adset_name=adset.get('name', 'Unknown'),
status=adset.get('status'),
)
count += 1
print(f" {count} ad sets cached")
async def grab_account_insights(self, date_preset: str = "today"):
"""
Grab and store account-level insights.
Args:
date_preset: Meta date preset (default: 'today')
"""
print(f"Grabbing account insights ({date_preset})...")
fields = [
AdsInsights.Field.impressions,
AdsInsights.Field.clicks,
AdsInsights.Field.spend,
AdsInsights.Field.cpc,
AdsInsights.Field.cpm,
AdsInsights.Field.ctr,
AdsInsights.Field.cpp,
AdsInsights.Field.reach,
AdsInsights.Field.frequency,
AdsInsights.Field.actions,
AdsInsights.Field.cost_per_action_type,
]
params = {
"date_preset": date_preset,
"level": "account",
}
insights = await self._rate_limited_request(
self.ad_account.get_insights,
fields=fields,
params=params,
)
# Store insights
timestamp = datetime.now(timezone.utc)
count = 0
for insight in insights:
await self.db.insert_account_insights(
time=timestamp,
account_id=self.ad_account_id,
data=dict(insight),
date_preset=date_preset,
)
count += 1
print(f" Account insights stored ({count} records)")
async def grab_campaign_insights(self, date_preset: str = "today", limit: int = 50):
"""
Grab and store campaign-level insights.
Args:
date_preset: Meta date preset
limit: Maximum number of campaigns
"""
print(f"Grabbing campaign insights ({date_preset}, limit={limit})...")
fields = [
AdsInsights.Field.campaign_id,
AdsInsights.Field.campaign_name,
AdsInsights.Field.impressions,
AdsInsights.Field.clicks,
AdsInsights.Field.spend,
AdsInsights.Field.ctr,
AdsInsights.Field.cpc,
AdsInsights.Field.cpm,
AdsInsights.Field.reach,
AdsInsights.Field.actions,
]
params = {
"date_preset": date_preset,
"level": "campaign",
"limit": limit,
}
insights = await self._rate_limited_request(
self.ad_account.get_insights,
fields=fields,
params=params,
)
# Store insights
timestamp = datetime.now(timezone.utc)
count = 0
for insight in insights:
campaign_id = insight.get('campaign_id')
if campaign_id:
await self.db.insert_campaign_insights(
time=timestamp,
campaign_id=campaign_id,
account_id=self.ad_account_id,
data=dict(insight),
date_preset=date_preset,
)
count += 1
print(f" Campaign insights stored ({count} records)")
async def grab_adset_insights(self, date_preset: str = "today", limit: int = 50):
"""
Grab and store ad set level insights.
Args:
date_preset: Meta date preset
limit: Maximum number of ad sets
"""
print(f"Grabbing ad set insights ({date_preset}, limit={limit})...")
fields = [
AdsInsights.Field.adset_id,
AdsInsights.Field.adset_name,
AdsInsights.Field.campaign_id,
AdsInsights.Field.impressions,
AdsInsights.Field.clicks,
AdsInsights.Field.spend,
AdsInsights.Field.ctr,
AdsInsights.Field.cpc,
AdsInsights.Field.cpm,
AdsInsights.Field.reach,
AdsInsights.Field.actions,
]
params = {
"date_preset": date_preset,
"level": "adset",
"limit": limit,
}
insights = await self._rate_limited_request(
self.ad_account.get_insights,
fields=fields,
params=params,
)
# Store insights
timestamp = datetime.now(timezone.utc)
count = 0
for insight in insights:
adset_id = insight.get('adset_id')
campaign_id = insight.get('campaign_id')
if adset_id and campaign_id:
await self.db.insert_adset_insights(
time=timestamp,
adset_id=adset_id,
campaign_id=campaign_id,
account_id=self.ad_account_id,
data=dict(insight),
date_preset=date_preset,
)
count += 1
print(f" Ad set insights stored ({count} records)")
async def run_collection_cycle(self, cache_metadata: bool = True):
"""
Run a single collection cycle.
Args:
cache_metadata: Whether to refresh metadata cache
"""
print("\n" + "="*60)
print(f"COLLECTION CYCLE - {datetime.now().isoformat()}")
print("="*60)
# Refresh metadata cache if requested (do this less frequently)
if cache_metadata:
await self.cache_account_metadata()
await self.cache_campaigns_metadata(limit=100)
await self.cache_adsets_metadata(limit=100)
# Grab insights (always use 'today' for scheduled collection)
await self.grab_account_insights(date_preset="today")
await self.grab_campaign_insights(date_preset="today", limit=50)
await self.grab_adset_insights(date_preset="today", limit=50)
print("\n" + "="*60)
print("COLLECTION CYCLE COMPLETE")
print("="*60 + "\n")
async def run_scheduled(
self,
interval_hours: float = 2.0,
refresh_metadata_every_n_cycles: int = 12,
):
"""
Run scheduled data collection.
Args:
interval_hours: Hours between collection cycles (default: 2)
refresh_metadata_every_n_cycles: Refresh metadata every N cycles (default: 12 = once per day if interval=2h)
"""
print("\n" + "="*60)
print("SCHEDULED INSIGHTS GRABBER STARTED")
print("="*60)
print(f"Account: {self.ad_account_id}")
print(f"Collection interval: {interval_hours} hours")
print(f"Metadata refresh: every {refresh_metadata_every_n_cycles} cycles")
print("="*60 + "\n")
# Connect to database
self.db = TimescaleDBClient()
await self.db.connect()
cycle_count = 0
try:
while True:
cycle_count += 1
# Refresh metadata periodically (e.g., once per day)
cache_metadata = (cycle_count % refresh_metadata_every_n_cycles == 1)
# Run collection
await self.run_collection_cycle(cache_metadata=cache_metadata)
# Wait for next cycle
wait_seconds = interval_hours * 3600
print(f"Waiting {interval_hours} hours until next cycle...")
print(f"Next collection: {datetime.now().astimezone().replace(microsecond=0) + timedelta(seconds=wait_seconds)}\n")
await asyncio.sleep(wait_seconds)
except KeyboardInterrupt:
print("\nShutting down gracefully...")
finally:
await self.db.close()
async def main():
"""Main entry point for scheduled grabber."""
try:
grabber = ScheduledInsightsGrabber()
# Run scheduled collection (every 2 hours)
await grabber.run_scheduled(
interval_hours=2.0,
refresh_metadata_every_n_cycles=12, # Refresh metadata once per day
)
except ValueError as e:
print(f"Configuration error: {e}")
return 1
except Exception as e:
print(f"Error: {e}")
import traceback
traceback.print_exc()
return 1
return 0
if __name__ == "__main__":
exit_code = asyncio.run(main())
exit(exit_code)

uv.lock (generated)

@@ -91,6 +91,36 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/fb/76/641ae371508676492379f16e2fa48f4e2c11741bd63c48be4b12a6b09cba/aiosignal-1.4.0-py3-none-any.whl", hash = "sha256:053243f8b92b990551949e63930a839ff0cf0b0ebbe0597b0f3fb19e1a0fe82e", size = 7490, upload-time = "2025-07-03T22:54:42.156Z" },
]
[[package]]
name = "alembic"
version = "1.17.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "mako" },
{ name = "sqlalchemy" },
{ name = "typing-extensions" },
]
sdist = { url = "https://files.pythonhosted.org/packages/6b/45/6f4555f2039f364c3ce31399529dcf48dd60726ff3715ad67f547d87dfd2/alembic-1.17.0.tar.gz", hash = "sha256:4652a0b3e19616b57d652b82bfa5e38bf5dbea0813eed971612671cb9e90c0fe", size = 1975526, upload-time = "2025-10-11T18:40:13.585Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/44/1f/38e29b06bfed7818ebba1f84904afdc8153ef7b6c7e0d8f3bc6643f5989c/alembic-1.17.0-py3-none-any.whl", hash = "sha256:80523bc437d41b35c5db7e525ad9d908f79de65c27d6a5a5eab6df348a352d99", size = 247449, upload-time = "2025-10-11T18:40:16.288Z" },
]
[[package]]
name = "asyncpg"
version = "0.30.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/2f/4c/7c991e080e106d854809030d8584e15b2e996e26f16aee6d757e387bc17d/asyncpg-0.30.0.tar.gz", hash = "sha256:c551e9928ab6707602f44811817f82ba3c446e018bfe1d3abecc8ba5f3eac851", size = 957746, upload-time = "2024-10-20T00:30:41.127Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/3a/22/e20602e1218dc07692acf70d5b902be820168d6282e69ef0d3cb920dc36f/asyncpg-0.30.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:05b185ebb8083c8568ea8a40e896d5f7af4b8554b64d7719c0eaa1eb5a5c3a70", size = 670373, upload-time = "2024-10-20T00:29:55.165Z" },
{ url = "https://files.pythonhosted.org/packages/3d/b3/0cf269a9d647852a95c06eb00b815d0b95a4eb4b55aa2d6ba680971733b9/asyncpg-0.30.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:c47806b1a8cbb0a0db896f4cd34d89942effe353a5035c62734ab13b9f938da3", size = 634745, upload-time = "2024-10-20T00:29:57.14Z" },
{ url = "https://files.pythonhosted.org/packages/8e/6d/a4f31bf358ce8491d2a31bfe0d7bcf25269e80481e49de4d8616c4295a34/asyncpg-0.30.0-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9b6fde867a74e8c76c71e2f64f80c64c0f3163e687f1763cfaf21633ec24ec33", size = 3512103, upload-time = "2024-10-20T00:29:58.499Z" },
{ url = "https://files.pythonhosted.org/packages/96/19/139227a6e67f407b9c386cb594d9628c6c78c9024f26df87c912fabd4368/asyncpg-0.30.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:46973045b567972128a27d40001124fbc821c87a6cade040cfcd4fa8a30bcdc4", size = 3592471, upload-time = "2024-10-20T00:30:00.354Z" },
{ url = "https://files.pythonhosted.org/packages/67/e4/ab3ca38f628f53f0fd28d3ff20edff1c975dd1cb22482e0061916b4b9a74/asyncpg-0.30.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:9110df111cabc2ed81aad2f35394a00cadf4f2e0635603db6ebbd0fc896f46a4", size = 3496253, upload-time = "2024-10-20T00:30:02.794Z" },
{ url = "https://files.pythonhosted.org/packages/ef/5f/0bf65511d4eeac3a1f41c54034a492515a707c6edbc642174ae79034d3ba/asyncpg-0.30.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:04ff0785ae7eed6cc138e73fc67b8e51d54ee7a3ce9b63666ce55a0bf095f7ba", size = 3662720, upload-time = "2024-10-20T00:30:04.501Z" },
{ url = "https://files.pythonhosted.org/packages/e7/31/1513d5a6412b98052c3ed9158d783b1e09d0910f51fbe0e05f56cc370bc4/asyncpg-0.30.0-cp313-cp313-win32.whl", hash = "sha256:ae374585f51c2b444510cdf3595b97ece4f233fde739aa14b50e0d64e8a7a590", size = 560404, upload-time = "2024-10-20T00:30:06.537Z" },
{ url = "https://files.pythonhosted.org/packages/c8/a4/cec76b3389c4c5ff66301cd100fe88c318563ec8a520e0b2e792b5b84972/asyncpg-0.30.0-cp313-cp313-win_amd64.whl", hash = "sha256:f59b430b8e27557c3fb9869222559f7417ced18688375825f8f12302c34e915e", size = 621623, upload-time = "2024-10-20T00:30:09.024Z" },
]
[[package]]
name = "attrs"
version = "25.4.0"
@@ -251,6 +281,30 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/9a/9a/e35b4a917281c0b8419d4207f4334c8e8c5dbf4f3f5f9ada73958d937dcc/frozenlist-1.8.0-py3-none-any.whl", hash = "sha256:0c18a16eab41e82c295618a77502e17b195883241c563b00f0aa5106fc4eaa0d", size = 13409, upload-time = "2025-10-06T05:38:16.721Z" },
]
[[package]]
name = "greenlet"
version = "3.2.4"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/03/b8/704d753a5a45507a7aab61f18db9509302ed3d0a27ac7e0359ec2905b1a6/greenlet-3.2.4.tar.gz", hash = "sha256:0dca0d95ff849f9a364385f36ab49f50065d76964944638be9691e1832e9f86d", size = 188260, upload-time = "2025-08-07T13:24:33.51Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/49/e8/58c7f85958bda41dafea50497cbd59738c5c43dbbea5ee83d651234398f4/greenlet-3.2.4-cp313-cp313-macosx_11_0_universal2.whl", hash = "sha256:1a921e542453fe531144e91e1feedf12e07351b1cf6c9e8a3325ea600a715a31", size = 272814, upload-time = "2025-08-07T13:15:50.011Z" },
{ url = "https://files.pythonhosted.org/packages/62/dd/b9f59862e9e257a16e4e610480cfffd29e3fae018a68c2332090b53aac3d/greenlet-3.2.4-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:cd3c8e693bff0fff6ba55f140bf390fa92c994083f838fece0f63be121334945", size = 641073, upload-time = "2025-08-07T13:42:57.23Z" },
{ url = "https://files.pythonhosted.org/packages/f7/0b/bc13f787394920b23073ca3b6c4a7a21396301ed75a655bcb47196b50e6e/greenlet-3.2.4-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:710638eb93b1fa52823aa91bf75326f9ecdfd5e0466f00789246a5280f4ba0fc", size = 655191, upload-time = "2025-08-07T13:45:29.752Z" },
{ url = "https://files.pythonhosted.org/packages/f2/d6/6adde57d1345a8d0f14d31e4ab9c23cfe8e2cd39c3baf7674b4b0338d266/greenlet-3.2.4-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:c5111ccdc9c88f423426df3fd1811bfc40ed66264d35aa373420a34377efc98a", size = 649516, upload-time = "2025-08-07T13:53:16.314Z" },
{ url = "https://files.pythonhosted.org/packages/7f/3b/3a3328a788d4a473889a2d403199932be55b1b0060f4ddd96ee7cdfcad10/greenlet-3.2.4-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:d76383238584e9711e20ebe14db6c88ddcedc1829a9ad31a584389463b5aa504", size = 652169, upload-time = "2025-08-07T13:18:32.861Z" },
{ url = "https://files.pythonhosted.org/packages/ee/43/3cecdc0349359e1a527cbf2e3e28e5f8f06d3343aaf82ca13437a9aa290f/greenlet-3.2.4-cp313-cp313-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:23768528f2911bcd7e475210822ffb5254ed10d71f4028387e5a99b4c6699671", size = 610497, upload-time = "2025-08-07T13:18:31.636Z" },
{ url = "https://files.pythonhosted.org/packages/b8/19/06b6cf5d604e2c382a6f31cafafd6f33d5dea706f4db7bdab184bad2b21d/greenlet-3.2.4-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:00fadb3fedccc447f517ee0d3fd8fe49eae949e1cd0f6a611818f4f6fb7dc83b", size = 1121662, upload-time = "2025-08-07T13:42:41.117Z" },
{ url = "https://files.pythonhosted.org/packages/a2/15/0d5e4e1a66fab130d98168fe984c509249c833c1a3c16806b90f253ce7b9/greenlet-3.2.4-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:d25c5091190f2dc0eaa3f950252122edbbadbb682aa7b1ef2f8af0f8c0afefae", size = 1149210, upload-time = "2025-08-07T13:18:24.072Z" },
{ url = "https://files.pythonhosted.org/packages/0b/55/2321e43595e6801e105fcfdee02b34c0f996eb71e6ddffca6b10b7e1d771/greenlet-3.2.4-cp313-cp313-win_amd64.whl", hash = "sha256:554b03b6e73aaabec3745364d6239e9e012d64c68ccd0b8430c64ccc14939a8b", size = 299685, upload-time = "2025-08-07T13:24:38.824Z" },
{ url = "https://files.pythonhosted.org/packages/22/5c/85273fd7cc388285632b0498dbbab97596e04b154933dfe0f3e68156c68c/greenlet-3.2.4-cp314-cp314-macosx_11_0_universal2.whl", hash = "sha256:49a30d5fda2507ae77be16479bdb62a660fa51b1eb4928b524975b3bde77b3c0", size = 273586, upload-time = "2025-08-07T13:16:08.004Z" },
{ url = "https://files.pythonhosted.org/packages/d1/75/10aeeaa3da9332c2e761e4c50d4c3556c21113ee3f0afa2cf5769946f7a3/greenlet-3.2.4-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:299fd615cd8fc86267b47597123e3f43ad79c9d8a22bebdce535e53550763e2f", size = 686346, upload-time = "2025-08-07T13:42:59.944Z" },
{ url = "https://files.pythonhosted.org/packages/c0/aa/687d6b12ffb505a4447567d1f3abea23bd20e73a5bed63871178e0831b7a/greenlet-3.2.4-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:c17b6b34111ea72fc5a4e4beec9711d2226285f0386ea83477cbb97c30a3f3a5", size = 699218, upload-time = "2025-08-07T13:45:30.969Z" },
{ url = "https://files.pythonhosted.org/packages/dc/8b/29aae55436521f1d6f8ff4e12fb676f3400de7fcf27fccd1d4d17fd8fecd/greenlet-3.2.4-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:b4a1870c51720687af7fa3e7cda6d08d801dae660f75a76f3845b642b4da6ee1", size = 694659, upload-time = "2025-08-07T13:53:17.759Z" },
{ url = "https://files.pythonhosted.org/packages/92/2e/ea25914b1ebfde93b6fc4ff46d6864564fba59024e928bdc7de475affc25/greenlet-3.2.4-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:061dc4cf2c34852b052a8620d40f36324554bc192be474b9e9770e8c042fd735", size = 695355, upload-time = "2025-08-07T13:18:34.517Z" },
{ url = "https://files.pythonhosted.org/packages/72/60/fc56c62046ec17f6b0d3060564562c64c862948c9d4bc8aa807cf5bd74f4/greenlet-3.2.4-cp314-cp314-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:44358b9bf66c8576a9f57a590d5f5d6e72fa4228b763d0e43fee6d3b06d3a337", size = 657512, upload-time = "2025-08-07T13:18:33.969Z" },
{ url = "https://files.pythonhosted.org/packages/e3/a5/6ddab2b4c112be95601c13428db1d8b6608a8b6039816f2ba09c346c08fc/greenlet-3.2.4-cp314-cp314-win_amd64.whl", hash = "sha256:e37ab26028f12dbb0ff65f29a8d3d44a765c61e729647bf2ddfbbed621726f01", size = 303425, upload-time = "2025-08-07T13:32:27.59Z" },
]
[[package]]
name = "idna"
version = "3.11"
@@ -260,23 +314,93 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/0e/61/66938bbb5fc52dbdf84594873d5b51fb1f7c7794e9c0f5bd885f30bc507b/idna-3.11-py3-none-any.whl", hash = "sha256:771a87f49d9defaf64091e6e6fe9c18d4833f140bd19464795bc32d966ca37ea", size = 71008, upload-time = "2025-10-12T14:55:18.883Z" },
]
[[package]]
name = "mako"
version = "1.3.10"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "markupsafe" },
]
sdist = { url = "https://files.pythonhosted.org/packages/9e/38/bd5b78a920a64d708fe6bc8e0a2c075e1389d53bef8413725c63ba041535/mako-1.3.10.tar.gz", hash = "sha256:99579a6f39583fa7e5630a28c3c1f440e4e97a414b80372649c0ce338da2ea28", size = 392474, upload-time = "2025-04-10T12:44:31.16Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/87/fb/99f81ac72ae23375f22b7afdb7642aba97c00a713c217124420147681a2f/mako-1.3.10-py3-none-any.whl", hash = "sha256:baef24a52fc4fc514a0887ac600f9f1cff3d82c61d4d700a1fa84d597b88db59", size = 78509, upload-time = "2025-04-10T12:50:53.297Z" },
]
[[package]]
name = "markupsafe"
version = "3.0.3"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/7e/99/7690b6d4034fffd95959cbe0c02de8deb3098cc577c67bb6a24fe5d7caa7/markupsafe-3.0.3.tar.gz", hash = "sha256:722695808f4b6457b320fdc131280796bdceb04ab50fe1795cd540799ebe1698", size = 80313, upload-time = "2025-09-27T18:37:40.426Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/38/2f/907b9c7bbba283e68f20259574b13d005c121a0fa4c175f9bed27c4597ff/markupsafe-3.0.3-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:e1cf1972137e83c5d4c136c43ced9ac51d0e124706ee1c8aa8532c1287fa8795", size = 11622, upload-time = "2025-09-27T18:36:41.777Z" },
{ url = "https://files.pythonhosted.org/packages/9c/d9/5f7756922cdd676869eca1c4e3c0cd0df60ed30199ffd775e319089cb3ed/markupsafe-3.0.3-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:116bb52f642a37c115f517494ea5feb03889e04df47eeff5b130b1808ce7c219", size = 12029, upload-time = "2025-09-27T18:36:43.257Z" },
{ url = "https://files.pythonhosted.org/packages/00/07/575a68c754943058c78f30db02ee03a64b3c638586fba6a6dd56830b30a3/markupsafe-3.0.3-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:133a43e73a802c5562be9bbcd03d090aa5a1fe899db609c29e8c8d815c5f6de6", size = 24374, upload-time = "2025-09-27T18:36:44.508Z" },
{ url = "https://files.pythonhosted.org/packages/a9/21/9b05698b46f218fc0e118e1f8168395c65c8a2c750ae2bab54fc4bd4e0e8/markupsafe-3.0.3-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:ccfcd093f13f0f0b7fdd0f198b90053bf7b2f02a3927a30e63f3ccc9df56b676", size = 22980, upload-time = "2025-09-27T18:36:45.385Z" },
{ url = "https://files.pythonhosted.org/packages/7f/71/544260864f893f18b6827315b988c146b559391e6e7e8f7252839b1b846a/markupsafe-3.0.3-cp313-cp313-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:509fa21c6deb7a7a273d629cf5ec029bc209d1a51178615ddf718f5918992ab9", size = 21990, upload-time = "2025-09-27T18:36:46.916Z" },
{ url = "https://files.pythonhosted.org/packages/c2/28/b50fc2f74d1ad761af2f5dcce7492648b983d00a65b8c0e0cb457c82ebbe/markupsafe-3.0.3-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:a4afe79fb3de0b7097d81da19090f4df4f8d3a2b3adaa8764138aac2e44f3af1", size = 23784, upload-time = "2025-09-27T18:36:47.884Z" },
{ url = "https://files.pythonhosted.org/packages/ed/76/104b2aa106a208da8b17a2fb72e033a5a9d7073c68f7e508b94916ed47a9/markupsafe-3.0.3-cp313-cp313-musllinux_1_2_riscv64.whl", hash = "sha256:795e7751525cae078558e679d646ae45574b47ed6e7771863fcc079a6171a0fc", size = 21588, upload-time = "2025-09-27T18:36:48.82Z" },
{ url = "https://files.pythonhosted.org/packages/b5/99/16a5eb2d140087ebd97180d95249b00a03aa87e29cc224056274f2e45fd6/markupsafe-3.0.3-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:8485f406a96febb5140bfeca44a73e3ce5116b2501ac54fe953e488fb1d03b12", size = 23041, upload-time = "2025-09-27T18:36:49.797Z" },
{ url = "https://files.pythonhosted.org/packages/19/bc/e7140ed90c5d61d77cea142eed9f9c303f4c4806f60a1044c13e3f1471d0/markupsafe-3.0.3-cp313-cp313-win32.whl", hash = "sha256:bdd37121970bfd8be76c5fb069c7751683bdf373db1ed6c010162b2a130248ed", size = 14543, upload-time = "2025-09-27T18:36:51.584Z" },
{ url = "https://files.pythonhosted.org/packages/05/73/c4abe620b841b6b791f2edc248f556900667a5a1cf023a6646967ae98335/markupsafe-3.0.3-cp313-cp313-win_amd64.whl", hash = "sha256:9a1abfdc021a164803f4d485104931fb8f8c1efd55bc6b748d2f5774e78b62c5", size = 15113, upload-time = "2025-09-27T18:36:52.537Z" },
{ url = "https://files.pythonhosted.org/packages/f0/3a/fa34a0f7cfef23cf9500d68cb7c32dd64ffd58a12b09225fb03dd37d5b80/markupsafe-3.0.3-cp313-cp313-win_arm64.whl", hash = "sha256:7e68f88e5b8799aa49c85cd116c932a1ac15caaa3f5db09087854d218359e485", size = 13911, upload-time = "2025-09-27T18:36:53.513Z" },
{ url = "https://files.pythonhosted.org/packages/e4/d7/e05cd7efe43a88a17a37b3ae96e79a19e846f3f456fe79c57ca61356ef01/markupsafe-3.0.3-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:218551f6df4868a8d527e3062d0fb968682fe92054e89978594c28e642c43a73", size = 11658, upload-time = "2025-09-27T18:36:54.819Z" },
{ url = "https://files.pythonhosted.org/packages/99/9e/e412117548182ce2148bdeacdda3bb494260c0b0184360fe0d56389b523b/markupsafe-3.0.3-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:3524b778fe5cfb3452a09d31e7b5adefeea8c5be1d43c4f810ba09f2ceb29d37", size = 12066, upload-time = "2025-09-27T18:36:55.714Z" },
{ url = "https://files.pythonhosted.org/packages/bc/e6/fa0ffcda717ef64a5108eaa7b4f5ed28d56122c9a6d70ab8b72f9f715c80/markupsafe-3.0.3-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:4e885a3d1efa2eadc93c894a21770e4bc67899e3543680313b09f139e149ab19", size = 25639, upload-time = "2025-09-27T18:36:56.908Z" },
{ url = "https://files.pythonhosted.org/packages/96/ec/2102e881fe9d25fc16cb4b25d5f5cde50970967ffa5dddafdb771237062d/markupsafe-3.0.3-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:8709b08f4a89aa7586de0aadc8da56180242ee0ada3999749b183aa23df95025", size = 23569, upload-time = "2025-09-27T18:36:57.913Z" },
{ url = "https://files.pythonhosted.org/packages/4b/30/6f2fce1f1f205fc9323255b216ca8a235b15860c34b6798f810f05828e32/markupsafe-3.0.3-cp313-cp313t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:b8512a91625c9b3da6f127803b166b629725e68af71f8184ae7e7d54686a56d6", size = 23284, upload-time = "2025-09-27T18:36:58.833Z" },
{ url = "https://files.pythonhosted.org/packages/58/47/4a0ccea4ab9f5dcb6f79c0236d954acb382202721e704223a8aafa38b5c8/markupsafe-3.0.3-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:9b79b7a16f7fedff2495d684f2b59b0457c3b493778c9eed31111be64d58279f", size = 24801, upload-time = "2025-09-27T18:36:59.739Z" },
{ url = "https://files.pythonhosted.org/packages/6a/70/3780e9b72180b6fecb83a4814d84c3bf4b4ae4bf0b19c27196104149734c/markupsafe-3.0.3-cp313-cp313t-musllinux_1_2_riscv64.whl", hash = "sha256:12c63dfb4a98206f045aa9563db46507995f7ef6d83b2f68eda65c307c6829eb", size = 22769, upload-time = "2025-09-27T18:37:00.719Z" },
{ url = "https://files.pythonhosted.org/packages/98/c5/c03c7f4125180fc215220c035beac6b9cb684bc7a067c84fc69414d315f5/markupsafe-3.0.3-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:8f71bc33915be5186016f675cd83a1e08523649b0e33efdb898db577ef5bb009", size = 23642, upload-time = "2025-09-27T18:37:01.673Z" },
{ url = "https://files.pythonhosted.org/packages/80/d6/2d1b89f6ca4bff1036499b1e29a1d02d282259f3681540e16563f27ebc23/markupsafe-3.0.3-cp313-cp313t-win32.whl", hash = "sha256:69c0b73548bc525c8cb9a251cddf1931d1db4d2258e9599c28c07ef3580ef354", size = 14612, upload-time = "2025-09-27T18:37:02.639Z" },
{ url = "https://files.pythonhosted.org/packages/2b/98/e48a4bfba0a0ffcf9925fe2d69240bfaa19c6f7507b8cd09c70684a53c1e/markupsafe-3.0.3-cp313-cp313t-win_amd64.whl", hash = "sha256:1b4b79e8ebf6b55351f0d91fe80f893b4743f104bff22e90697db1590e47a218", size = 15200, upload-time = "2025-09-27T18:37:03.582Z" },
{ url = "https://files.pythonhosted.org/packages/0e/72/e3cc540f351f316e9ed0f092757459afbc595824ca724cbc5a5d4263713f/markupsafe-3.0.3-cp313-cp313t-win_arm64.whl", hash = "sha256:ad2cf8aa28b8c020ab2fc8287b0f823d0a7d8630784c31e9ee5edea20f406287", size = 13973, upload-time = "2025-09-27T18:37:04.929Z" },
{ url = "https://files.pythonhosted.org/packages/33/8a/8e42d4838cd89b7dde187011e97fe6c3af66d8c044997d2183fbd6d31352/markupsafe-3.0.3-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:eaa9599de571d72e2daf60164784109f19978b327a3910d3e9de8c97b5b70cfe", size = 11619, upload-time = "2025-09-27T18:37:06.342Z" },
{ url = "https://files.pythonhosted.org/packages/b5/64/7660f8a4a8e53c924d0fa05dc3a55c9cee10bbd82b11c5afb27d44b096ce/markupsafe-3.0.3-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:c47a551199eb8eb2121d4f0f15ae0f923d31350ab9280078d1e5f12b249e0026", size = 12029, upload-time = "2025-09-27T18:37:07.213Z" },
{ url = "https://files.pythonhosted.org/packages/da/ef/e648bfd021127bef5fa12e1720ffed0c6cbb8310c8d9bea7266337ff06de/markupsafe-3.0.3-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:f34c41761022dd093b4b6896d4810782ffbabe30f2d443ff5f083e0cbbb8c737", size = 24408, upload-time = "2025-09-27T18:37:09.572Z" },
{ url = "https://files.pythonhosted.org/packages/41/3c/a36c2450754618e62008bf7435ccb0f88053e07592e6028a34776213d877/markupsafe-3.0.3-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:457a69a9577064c05a97c41f4e65148652db078a3a509039e64d3467b9e7ef97", size = 23005, upload-time = "2025-09-27T18:37:10.58Z" },
{ url = "https://files.pythonhosted.org/packages/bc/20/b7fdf89a8456b099837cd1dc21974632a02a999ec9bf7ca3e490aacd98e7/markupsafe-3.0.3-cp314-cp314-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:e8afc3f2ccfa24215f8cb28dcf43f0113ac3c37c2f0f0806d8c70e4228c5cf4d", size = 22048, upload-time = "2025-09-27T18:37:11.547Z" },
{ url = "https://files.pythonhosted.org/packages/9a/a7/591f592afdc734f47db08a75793a55d7fbcc6902a723ae4cfbab61010cc5/markupsafe-3.0.3-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:ec15a59cf5af7be74194f7ab02d0f59a62bdcf1a537677ce67a2537c9b87fcda", size = 23821, upload-time = "2025-09-27T18:37:12.48Z" },
{ url = "https://files.pythonhosted.org/packages/7d/33/45b24e4f44195b26521bc6f1a82197118f74df348556594bd2262bda1038/markupsafe-3.0.3-cp314-cp314-musllinux_1_2_riscv64.whl", hash = "sha256:0eb9ff8191e8498cca014656ae6b8d61f39da5f95b488805da4bb029cccbfbaf", size = 21606, upload-time = "2025-09-27T18:37:13.485Z" },
{ url = "https://files.pythonhosted.org/packages/ff/0e/53dfaca23a69fbfbbf17a4b64072090e70717344c52eaaaa9c5ddff1e5f0/markupsafe-3.0.3-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:2713baf880df847f2bece4230d4d094280f4e67b1e813eec43b4c0e144a34ffe", size = 23043, upload-time = "2025-09-27T18:37:14.408Z" },
{ url = "https://files.pythonhosted.org/packages/46/11/f333a06fc16236d5238bfe74daccbca41459dcd8d1fa952e8fbd5dccfb70/markupsafe-3.0.3-cp314-cp314-win32.whl", hash = "sha256:729586769a26dbceff69f7a7dbbf59ab6572b99d94576a5592625d5b411576b9", size = 14747, upload-time = "2025-09-27T18:37:15.36Z" },
{ url = "https://files.pythonhosted.org/packages/28/52/182836104b33b444e400b14f797212f720cbc9ed6ba34c800639d154e821/markupsafe-3.0.3-cp314-cp314-win_amd64.whl", hash = "sha256:bdc919ead48f234740ad807933cdf545180bfbe9342c2bb451556db2ed958581", size = 15341, upload-time = "2025-09-27T18:37:16.496Z" },
{ url = "https://files.pythonhosted.org/packages/6f/18/acf23e91bd94fd7b3031558b1f013adfa21a8e407a3fdb32745538730382/markupsafe-3.0.3-cp314-cp314-win_arm64.whl", hash = "sha256:5a7d5dc5140555cf21a6fefbdbf8723f06fcd2f63ef108f2854de715e4422cb4", size = 14073, upload-time = "2025-09-27T18:37:17.476Z" },
{ url = "https://files.pythonhosted.org/packages/3c/f0/57689aa4076e1b43b15fdfa646b04653969d50cf30c32a102762be2485da/markupsafe-3.0.3-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:1353ef0c1b138e1907ae78e2f6c63ff67501122006b0f9abad68fda5f4ffc6ab", size = 11661, upload-time = "2025-09-27T18:37:18.453Z" },
{ url = "https://files.pythonhosted.org/packages/89/c3/2e67a7ca217c6912985ec766c6393b636fb0c2344443ff9d91404dc4c79f/markupsafe-3.0.3-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:1085e7fbddd3be5f89cc898938f42c0b3c711fdcb37d75221de2666af647c175", size = 12069, upload-time = "2025-09-27T18:37:19.332Z" },
{ url = "https://files.pythonhosted.org/packages/f0/00/be561dce4e6ca66b15276e184ce4b8aec61fe83662cce2f7d72bd3249d28/markupsafe-3.0.3-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:1b52b4fb9df4eb9ae465f8d0c228a00624de2334f216f178a995ccdcf82c4634", size = 25670, upload-time = "2025-09-27T18:37:20.245Z" },
{ url = "https://files.pythonhosted.org/packages/50/09/c419f6f5a92e5fadde27efd190eca90f05e1261b10dbd8cbcb39cd8ea1dc/markupsafe-3.0.3-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:fed51ac40f757d41b7c48425901843666a6677e3e8eb0abcff09e4ba6e664f50", size = 23598, upload-time = "2025-09-27T18:37:21.177Z" },
{ url = "https://files.pythonhosted.org/packages/22/44/a0681611106e0b2921b3033fc19bc53323e0b50bc70cffdd19f7d679bb66/markupsafe-3.0.3-cp314-cp314t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:f190daf01f13c72eac4efd5c430a8de82489d9cff23c364c3ea822545032993e", size = 23261, upload-time = "2025-09-27T18:37:22.167Z" },
{ url = "https://files.pythonhosted.org/packages/5f/57/1b0b3f100259dc9fffe780cfb60d4be71375510e435efec3d116b6436d43/markupsafe-3.0.3-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:e56b7d45a839a697b5eb268c82a71bd8c7f6c94d6fd50c3d577fa39a9f1409f5", size = 24835, upload-time = "2025-09-27T18:37:23.296Z" },
{ url = "https://files.pythonhosted.org/packages/26/6a/4bf6d0c97c4920f1597cc14dd720705eca0bf7c787aebc6bb4d1bead5388/markupsafe-3.0.3-cp314-cp314t-musllinux_1_2_riscv64.whl", hash = "sha256:f3e98bb3798ead92273dc0e5fd0f31ade220f59a266ffd8a4f6065e0a3ce0523", size = 22733, upload-time = "2025-09-27T18:37:24.237Z" },
{ url = "https://files.pythonhosted.org/packages/14/c7/ca723101509b518797fedc2fdf79ba57f886b4aca8a7d31857ba3ee8281f/markupsafe-3.0.3-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:5678211cb9333a6468fb8d8be0305520aa073f50d17f089b5b4b477ea6e67fdc", size = 23672, upload-time = "2025-09-27T18:37:25.271Z" },
{ url = "https://files.pythonhosted.org/packages/fb/df/5bd7a48c256faecd1d36edc13133e51397e41b73bb77e1a69deab746ebac/markupsafe-3.0.3-cp314-cp314t-win32.whl", hash = "sha256:915c04ba3851909ce68ccc2b8e2cd691618c4dc4c4232fb7982bca3f41fd8c3d", size = 14819, upload-time = "2025-09-27T18:37:26.285Z" },
{ url = "https://files.pythonhosted.org/packages/1a/8a/0402ba61a2f16038b48b39bccca271134be00c5c9f0f623208399333c448/markupsafe-3.0.3-cp314-cp314t-win_amd64.whl", hash = "sha256:4faffd047e07c38848ce017e8725090413cd80cbc23d86e55c587bf979e579c9", size = 15426, upload-time = "2025-09-27T18:37:27.316Z" },
{ url = "https://files.pythonhosted.org/packages/70/bc/6f1c2f612465f5fa89b95bead1f44dcb607670fd42891d8fdcd5d039f4f4/markupsafe-3.0.3-cp314-cp314t-win_arm64.whl", hash = "sha256:32001d6a8fc98c8cb5c947787c5d08b0a50663d139f1305bac5885d98d9b40fa", size = 14146, upload-time = "2025-09-27T18:37:28.327Z" },
]

[[package]]
name = "meta-api-grabber"
version = "0.1.0"
source = { virtual = "." }
dependencies = [
{ name = "aiohttp" },
{ name = "alembic" },
{ name = "asyncpg" },
{ name = "facebook-business" },
{ name = "python-dotenv" },
{ name = "requests-oauthlib" },
{ name = "sqlalchemy", extra = ["asyncio"] },
]

[package.metadata]
requires-dist = [
{ name = "aiohttp", specifier = ">=3.13.1" },
{ name = "alembic", specifier = ">=1.17.0" },
{ name = "asyncpg", specifier = ">=0.30.0" },
{ name = "facebook-business", specifier = ">=23.0.3" },
{ name = "python-dotenv", specifier = ">=1.1.1" },
{ name = "requests-oauthlib", specifier = ">=2.0.0" },
{ name = "sqlalchemy", extras = ["asyncio"], specifier = ">=2.0.44" },
]

[[package]]
@@ -493,6 +617,41 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/b7/ce/149a00dd41f10bc29e5921b496af8b574d8413afcd5e30dfa0ed46c2cc5e/six-1.17.0-py2.py3-none-any.whl", hash = "sha256:4721f391ed90541fddacab5acf947aa0d3dc7d27b2e1e8eda2be8970586c3274", size = 11050, upload-time = "2024-12-04T17:35:26.475Z" },
]

[[package]]
name = "sqlalchemy"
version = "2.0.44"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "greenlet", marker = "platform_machine == 'AMD64' or platform_machine == 'WIN32' or platform_machine == 'aarch64' or platform_machine == 'amd64' or platform_machine == 'ppc64le' or platform_machine == 'win32' or platform_machine == 'x86_64'" },
{ name = "typing-extensions" },
]
sdist = { url = "https://files.pythonhosted.org/packages/f0/f2/840d7b9496825333f532d2e3976b8eadbf52034178aac53630d09fe6e1ef/sqlalchemy-2.0.44.tar.gz", hash = "sha256:0ae7454e1ab1d780aee69fd2aae7d6b8670a581d8847f2d1e0f7ddfbf47e5a22", size = 9819830, upload-time = "2025-10-10T14:39:12.935Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/45/d3/c67077a2249fdb455246e6853166360054c331db4613cda3e31ab1cadbef/sqlalchemy-2.0.44-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:ff486e183d151e51b1d694c7aa1695747599bb00b9f5f604092b54b74c64a8e1", size = 2135479, upload-time = "2025-10-10T16:03:37.671Z" },
{ url = "https://files.pythonhosted.org/packages/2b/91/eabd0688330d6fd114f5f12c4f89b0d02929f525e6bf7ff80aa17ca802af/sqlalchemy-2.0.44-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:0b1af8392eb27b372ddb783b317dea0f650241cea5bd29199b22235299ca2e45", size = 2123212, upload-time = "2025-10-10T16:03:41.755Z" },
{ url = "https://files.pythonhosted.org/packages/b0/bb/43e246cfe0e81c018076a16036d9b548c4cc649de241fa27d8d9ca6f85ab/sqlalchemy-2.0.44-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2b61188657e3a2b9ac4e8f04d6cf8e51046e28175f79464c67f2fd35bceb0976", size = 3255353, upload-time = "2025-10-10T15:35:31.221Z" },
{ url = "https://files.pythonhosted.org/packages/b9/96/c6105ed9a880abe346b64d3b6ddef269ddfcab04f7f3d90a0bf3c5a88e82/sqlalchemy-2.0.44-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b87e7b91a5d5973dda5f00cd61ef72ad75a1db73a386b62877d4875a8840959c", size = 3260222, upload-time = "2025-10-10T15:43:50.124Z" },
{ url = "https://files.pythonhosted.org/packages/44/16/1857e35a47155b5ad927272fee81ae49d398959cb749edca6eaa399b582f/sqlalchemy-2.0.44-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:15f3326f7f0b2bfe406ee562e17f43f36e16167af99c4c0df61db668de20002d", size = 3189614, upload-time = "2025-10-10T15:35:32.578Z" },
{ url = "https://files.pythonhosted.org/packages/88/ee/4afb39a8ee4fc786e2d716c20ab87b5b1fb33d4ac4129a1aaa574ae8a585/sqlalchemy-2.0.44-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:1e77faf6ff919aa8cd63f1c4e561cac1d9a454a191bb864d5dd5e545935e5a40", size = 3226248, upload-time = "2025-10-10T15:43:51.862Z" },
{ url = "https://files.pythonhosted.org/packages/32/d5/0e66097fc64fa266f29a7963296b40a80d6a997b7ac13806183700676f86/sqlalchemy-2.0.44-cp313-cp313-win32.whl", hash = "sha256:ee51625c2d51f8baadf2829fae817ad0b66b140573939dd69284d2ba3553ae73", size = 2101275, upload-time = "2025-10-10T15:03:26.096Z" },
{ url = "https://files.pythonhosted.org/packages/03/51/665617fe4f8c6450f42a6d8d69243f9420f5677395572c2fe9d21b493b7b/sqlalchemy-2.0.44-cp313-cp313-win_amd64.whl", hash = "sha256:c1c80faaee1a6c3428cecf40d16a2365bcf56c424c92c2b6f0f9ad204b899e9e", size = 2127901, upload-time = "2025-10-10T15:03:27.548Z" },
{ url = "https://files.pythonhosted.org/packages/9c/5e/6a29fa884d9fb7ddadf6b69490a9d45fded3b38541713010dad16b77d015/sqlalchemy-2.0.44-py3-none-any.whl", hash = "sha256:19de7ca1246fbef9f9d1bff8f1ab25641569df226364a0e40457dc5457c54b05", size = 1928718, upload-time = "2025-10-10T15:29:45.32Z" },
]

[package.optional-dependencies]
asyncio = [
{ name = "greenlet" },
]

[[package]]
name = "typing-extensions"
version = "4.15.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/72/94/1a15dd82efb362ac84269196e94cf00f187f7ed21c242792a923cdb1c61f/typing_extensions-4.15.0.tar.gz", hash = "sha256:0cea48d173cc12fa28ecabc3b837ea3cf6f38c6d1136f85cbaaf598984861466", size = 109391, upload-time = "2025-08-25T13:49:26.313Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/18/67/36e9267722cc04a6b9f15c7f3441c2363321a3ea07da7ae0c0707beb2a9c/typing_extensions-4.15.0-py3-none-any.whl", hash = "sha256:f0fa19c6845758ab08074a0cfa8b7aecb71c999ca73d62883bc25cc018c4e548", size = 44614, upload-time = "2025-08-25T13:49:24.86Z" },
]

[[package]]
name = "urllib3"
version = "2.5.0"