adbc-poolhouse¶
adbc-poolhouse creates a SQLAlchemy QueuePool from a typed warehouse config. One config in, one pool out — no boilerplate around driver detection or connection string assembly.
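In plain SQLAlchemy terms, the idea is roughly this — a conceptual sketch only, not the library's internals; `make_pool` and the stdlib `sqlite3` factory are illustrative stand-ins for what a typed config provides:

```python
# Conceptual sketch: adbc-poolhouse does this wiring for you from a typed
# config. make_pool and the sqlite3 factory here are illustrative stand-ins.
import sqlite3

from sqlalchemy.pool import QueuePool


def make_pool(database: str) -> QueuePool:
    # A config's essential job: turn connection settings into a
    # zero-argument factory that QueuePool can call on demand.
    return QueuePool(lambda: sqlite3.connect(database), pool_size=5, max_overflow=10)


pool = make_pool(":memory:")
conn = pool.connect()      # DBAPI connection checked out of the pool
cursor = conn.cursor()
cursor.execute("SELECT 1")
row = cursor.fetchone()    # (1,)
conn.close()               # check-in: the connection goes back to the pool
pool.dispose()
```

The point of the library is that the config class, not your code, knows which ADBC driver to load and how to assemble its connection arguments.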
Installation¶

```shell
pip install adbc-poolhouse
```

Or with uv:

```shell
uv add adbc-poolhouse
```
ADBC drivers¶
adbc-poolhouse manages the pool, not the driver. You also need an ADBC driver for your target warehouse; for PyPI-hosted drivers, install the matching extra alongside adbc-poolhouse:
| Warehouse | Install command |
|---|---|
| Apache Arrow Flight SQL | pip install "adbc-poolhouse[flightsql]" |
| BigQuery | pip install "adbc-poolhouse[bigquery]" |
| DuckDB | pip install "adbc-poolhouse[duckdb]" |
| PostgreSQL | pip install "adbc-poolhouse[postgresql]" |
| Snowflake | pip install "adbc-poolhouse[snowflake]" |
| SQLite | pip install "adbc-poolhouse[sqlite]" |

The remaining drivers — ClickHouse, Databricks, MSSQL / Azure SQL / Fabric, MySQL, Redshift, and Trino — are Foundry-distributed and not on PyPI; see Foundry installation.
First pool in five minutes¶
All supported warehouses have a typed config class.
PyPI-installed: BigQueryConfig, DuckDBConfig, FlightSQLConfig, PostgreSQLConfig, SnowflakeConfig, SQLiteConfig.
Foundry-distributed: ClickHouseConfig, DatabricksConfig, MSSQLConfig, MySQLConfig, RedshiftConfig, TrinoConfig.
The example below uses DuckDB — no credentials or running server required.
```python
from adbc_poolhouse import DuckDBConfig, create_pool, close_pool

# File-backed database (connections share the same file)
config = DuckDBConfig(database="/tmp/warehouse.db")
pool = create_pool(config)

with pool.connect() as conn:
    cursor = conn.cursor()
    cursor.execute("SELECT 42 AS answer")
    row = cursor.fetchone()
    print(row)  # (42,)

close_pool(pool)
```
pool.connect() checks out a connection from the pool and returns it when the with block exits. close_pool(pool) drains the pool and closes the underlying ADBC source connection.
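The same checkout/check-in/drain lifecycle can be observed on a bare SQLAlchemy QueuePool. This sketch uses stdlib sqlite3 rather than an ADBC driver, and dispose() stands in for the drain step that close_pool performs (close_pool additionally closes the ADBC source connection):

```python
# Illustration with a bare SQLAlchemy QueuePool and stdlib sqlite3.
import sqlite3

from sqlalchemy.pool import QueuePool

pool = QueuePool(lambda: sqlite3.connect(":memory:"), pool_size=1, max_overflow=0)

conn = pool.connect()
assert pool.checkedout() == 1  # one connection is checked out
conn.close()                   # check-in, not a real close
assert pool.checkedout() == 0
assert pool.checkedin() == 1   # the connection stays warm in the pool

pool.dispose()                 # drain: the pooled connection is actually closed
assert pool.checkedin() == 0
```

Because check-in keeps the underlying connection alive, forgetting the final drain is what leaks warehouse sessions — hence the explicit close_pool step above.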
What's next¶
- Pool lifecycle — how to dispose correctly, pytest fixture patterns, and common mistakes
- Consumer patterns — wiring a pool into FastAPI and reading credentials from a dbt profiles file
- Configuration reference — environment variable prefixes, pool tuning, and secret handling
- Snowflake guide — supported auth methods and private key variants
- Warehouse guides — per-warehouse install commands, auth examples, and env var prefixes
See also¶
- API Reference — auto-generated from source
- Changelog