BigQuery-DatasetManager
BigQuery-DatasetManager is a simple file-based CLI management tool for BigQuery Datasets.
Requirements
Python
CPython 2.7, 3.4, 3.5, 3.6
Installation
$ pip install BigQuery-DatasetManager
Resource representation
The resource representation of a dataset is described in YAML format.
name: billing
friendly_name: null
description: null
default_table_expiration_ms: null
location: US
access_grants:
- role: OWNER
  entity_type: specialGroup
  entity_id: projectOwners
- role: OWNER
  entity_type: userByEmail
  entity_id: billing-export-bigquery@system.gserviceaccount.com
- role: null
  entity_type: view
  entity_id:
    datasetId: view
    projectId: your-project-id
    tableId: billing_view
See the official documentation of BigQuery Datasets for details of the key names.
Key name | Value | Description
---|---|---
name | str | The name of the dataset.
friendly_name | str | Title of the dataset.
description | str | Description of the dataset.
default_table_expiration_ms | int | Default expiration time for tables in the dataset.
location | str | Location in which the dataset is hosted.
access_grants | seq | Roles granted to entities for this dataset.
access_grants.role | str | Role granted to the entity. One of `OWNER`, `WRITER` or `READER`. May also be null if the entity_type is `view`.
access_grants.entity_type | str | Type of entity being granted the role. One of `userByEmail`, `groupByEmail`, `domain`, `specialGroup` or `view`.
access_grants.entity_id | str/map | ID of the entity being granted the role.
access_grants.entity_id.datasetId | str | The ID of the dataset containing this table. (Specified when entity_type is `view`.)
access_grants.entity_id.projectId | str | The ID of the project containing this table. (Specified when entity_type is `view`.)
access_grants.entity_id.tableId | str | The ID of the table. (Specified when entity_type is `view`.)
Usage
Usage: bqdm [OPTIONS] COMMAND [ARGS]...
Options:
-c, --credential_file PATH Location of credential file for service accounts.
--debug Debug output management.
-h, --help Show this message and exit.
Commands:
apply Builds or changes datasets.
destroy Specify subcommand `plan` or `apply`.
export Export existing datasets into file in YAML format.
plan Generate and show an execution plan.
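A typical end-to-end flow combines these commands: export the current state, edit the YAML files, preview the diff, then apply. The sketch below wraps that flow in a function; `bqdm` is assumed to be installed and authenticated, and `./conf` is a hypothetical configuration directory, so the function is defined but not invoked here:

```shell
#!/bin/sh
# Sketch of a typical BigQuery-DatasetManager workflow.
# Assumes `bqdm` is on PATH and credentials are configured.
set -eu

CONF_DIR=./conf

workflow() {
  bqdm export -o "$CONF_DIR"    # dump existing datasets as YAML files
  # ...edit the YAML files under "$CONF_DIR"...
  bqdm plan -d "$CONF_DIR"      # preview what would change
  bqdm apply -d "$CONF_DIR"     # build or change the datasets
}

# Uncomment to run against your project:
# workflow
```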
Export
Usage: bqdm export [OPTIONS]
Export existing datasets into file in YAML format.
Options:
-o, --output_dir TEXT  Directory path to output YAML files. [required]
-h, --help Show this message and exit.
Plan
Usage: bqdm plan [OPTIONS]
Generate and show an execution plan.
Options:
-d, --conf_dir DIRECTORY  Directory path where YAML files are located. [required]
--detailed_exitcode       Return a detailed exit code when the command exits.
                          When provided, this argument changes the exit codes
                          and their meanings to provide more granular
                          information about what the resulting plan contains:
                          0 = Succeeded with empty diff
                          1 = Error
                          2 = Succeeded with non-empty diff
-h, --help Show this message and exit.
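The `--detailed_exitcode` codes are designed for automation, e.g. gating a CI pipeline on whether a plan contains changes. The sketch below maps the documented codes to messages; the `interpret_plan_exit` helper is hypothetical, and `bqdm` itself is only shown in a comment rather than run:

```shell
#!/bin/sh
# Sketch: interpret the exit codes of `bqdm plan --detailed_exitcode`.
# `interpret_plan_exit` is a hypothetical helper; bqdm is not invoked here.
interpret_plan_exit() {
  case "$1" in
    0) echo "succeeded: empty diff" ;;
    1) echo "error" ;;
    2) echo "succeeded: non-empty diff" ;;
    *) echo "unknown exit code: $1" ;;
  esac
}

# In CI you would run, for example:
#   bqdm plan -d ./conf --detailed_exitcode
#   interpret_plan_exit $?
interpret_plan_exit 2
```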
Apply
Usage: bqdm apply [OPTIONS]
Builds or changes datasets.
Options:
-d, --conf_dir DIRECTORY  Directory path where YAML files are located. [required]
-h, --help Show this message and exit.
Destroy
Usage: bqdm destroy [OPTIONS] COMMAND [ARGS]...
Specify subcommand `plan` or `apply`.
Options:
-h, --help Show this message and exit.
Commands:
apply Destroy managed datasets.
plan Generate and show an execution plan for datasets destruction.
Destroy plan
Usage: bqdm destroy plan [OPTIONS]
Generate and show an execution plan for datasets destruction.
Options:
-d, --conf_dir DIRECTORY  Directory path where YAML files are located. [required]
--detailed_exitcode       Return a detailed exit code when the command exits.
                          When provided, this argument changes the exit codes
                          and their meanings to provide more granular
                          information about what the resulting plan contains:
                          0 = Succeeded with empty diff
                          1 = Error
                          2 = Succeeded with non-empty diff
-h, --help Show this message and exit.
Destroy apply
Usage: bqdm destroy apply [OPTIONS]
Destroy managed datasets.
Options:
-d, --conf_dir DIRECTORY  Directory path where YAML files are located. [required]
-h, --help Show this message and exit.
Authentication
See the authentication section in the official documentation of google-cloud-python.
If you’re running in Compute Engine or App Engine, authentication should “just work”.
If you’re developing locally, the easiest way to authenticate is using the Google Cloud SDK:
$ gcloud auth application-default login
Note that this command generates credentials for client libraries. To authenticate the CLI itself, use:
$ gcloud auth login
Previously, gcloud auth login was used for both use cases. If your gcloud installation does not support the new command, please update it:
$ gcloud components update
If you’re running your application elsewhere, you should download a service account JSON keyfile and point to it using an environment variable:
$ export GOOGLE_APPLICATION_CREDENTIALS="/path/to/keyfile.json"
Testing
Run tests
$ py.test
Run tests against multiple Python versions
$ pip install tox
$ pyenv local 2.7.13 3.4.6 3.5.3 3.6.1
$ tox
TODO
Support the labels field (currently google-cloud-bigquery does not support the labels field of Datasets)
Manage table resources