pg-bulk-ingest

A Python utility function for ingesting data into SQLAlchemy-defined PostgreSQL tables, automatically migrating them as needed, and allowing concurrent reads as much as possible.

Allowing concurrent writes is not an aim of pg-bulk-ingest. It is designed for use in ETL pipelines where PostgreSQL is used as a data warehouse, and the only writes to the table are from pg-bulk-ingest. It is assumed that there is only one pg-bulk-ingest running against a given table at any one time.
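As a rough illustration, a minimal ingest might look like the sketch below. It is based on the documented API; the connection string, schema, and table are placeholders, and the exact call signature is best confirmed against the documentation linked at the end of this description.

    import sqlalchemy as sa
    from pg_bulk_ingest import ingest

    # Placeholder connection string; any SQLAlchemy PostgreSQL engine
    engine = sa.create_engine('postgresql+psycopg2://postgres@127.0.0.1:5432/')

    # The target table is defined with ordinary SQLAlchemy constructs;
    # pg-bulk-ingest creates or migrates it as needed
    metadata = sa.MetaData()
    my_table = sa.Table(
        'my_table',
        metadata,
        sa.Column('id', sa.INTEGER, primary_key=True),
        sa.Column('value', sa.VARCHAR(16), nullable=False),
        schema='my_schema',
    )

    # A function yielding batches: each batch is ingested in its own
    # transaction, and each row is a (table, row-tuple) pair
    def batches(high_watermark):
        yield None, None, (
            (my_table, (1, 'a')),
            (my_table, (2, 'b')),
        )

    with engine.connect() as conn:
        ingest(conn, metadata, batches)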

Features

pg-bulk-ingest exposes a single function as its API (sketched in the example after this list) that:

  • Creates the tables if necessary
  • Migrates any existing tables if necessary, minimising locking
  • Ingests data in batches, where each batch is ingested in its own transaction
  • Handles "high-watermarking" to carry on from where a previous ingest finished or errored
  • Optionally performs an "upsert", matching rows on primary key
  • Optionally deletes all existing rows before ingestion
  • Optionally calls a callback just before each batch is visible to other database clients
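The options above map onto keyword arguments of that single function. Continuing the sketch from earlier, they might be combined as below; the enum-style constants and the on_before_visible callback follow the documented API, but treat the exact names and defaults as things to confirm in the documentation.

    from pg_bulk_ingest import HighWatermark, Visibility, Upsert, Delete, ingest

    # Called inside the batch's transaction, just before the batch
    # becomes visible to other database clients
    def on_before_visible(conn, ingest_table, batch_metadata):
        pass

    with engine.connect() as conn:
        ingest(
            conn, metadata, batches,
            high_watermark=HighWatermark.LATEST,     # carry on from the previous ingest
            visibility=Visibility.AFTER_EACH_BATCH,  # when new data is visible to other clients
            upsert=Upsert.IF_PRIMARY_KEY,            # match rows on primary key
            delete=Delete.OFF,                       # or delete all existing rows first
            on_before_visible=on_before_visible,
        )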

Visit the pg-bulk-ingest documentation for usage instructions.

Download files

Download the file for your platform. If you're not sure which to choose, learn more about installing packages.
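In practice the package is usually installed from PyPI with pip rather than by downloading these files directly:

    pip install pg-bulk-ingest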

Source Distribution

pg_bulk_ingest-0.0.51.tar.gz (8.8 kB)

Built Distribution

pg_bulk_ingest-0.0.51-py3-none-any.whl (8.0 kB)
