
qds_sdk 1.2.0

Python SDK for coding to the Qubole Data Service API

Latest Version: 1.9.8

A Python module that provides the tools you need to authenticate with and use the Qubole Data Service API.


Installation

Run the following command (you may need to run it as root):

$ python setup.py install

This should place a command line utility, qds.py, somewhere in your path:

$ which qds.py

Alternate Virtualenv Installation

Alternatively, if you use virtualenv, you can do this:

$ cd qds-sdk-py
$ virtualenv venv
$ source venv/bin/activate
$ python setup.py install

This installs the SDK within your virtualenv.

CLI

The CLI lets you run Hive, Hadoop, Pig, Presto, and Shell commands against QDS. Users can run commands synchronously, or submit a command and check its status later.

$ qds.py -h  # will print detailed usage


  1. Run a Hive query and print the results:

    $ qds.py --token 'xxyyzz' hivecmd run --query "show tables"
    $ qds.py --token 'xxyyzz' hivecmd run --script_location /tmp/myquery
    $ qds.py --token 'xxyyzz' hivecmd run --script_location s3://my-qubole-location/myquery
  2. Pass in the API token from a bash environment variable instead of --token:

    $ export QDS_API_TOKEN=xxyyzz
    $ qds.py hivecmd run --query "show tables"
  3. Run the example Hadoop streaming command:

    $ qds.py hadoopcmd run streaming -files 's3n://paid-qubole/HadoopAPIExamples/WordCountPython/mapper.py,s3n://paid-qubole/HadoopAPIExamples/WordCountPython/reducer.py' -mapper mapper.py -reducer reducer.py -numReduceTasks 1 -input 's3n://paid-qubole/default-datasets/gutenberg' -output 's3n://example.bucket.com/wcout'
  4. Check the status of command # 12345678:

    $ qds.py hivecmd check 12345678
    {"status": "done", ... }
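The run-versus-check distinction above can also be driven programmatically as a submit-then-poll loop. Below is a self-contained sketch: `make_fetch_status` and `wait_for_command` are local stand-ins, not part of qds-sdk, and the real status lookup would be something like shelling out to `qds.py hivecmd check <id>`; the terminal status names are assumed from the JSON the CLI prints.

```python
import time

# Stand-in for a real status lookup (e.g. parsing the JSON printed by
# `qds.py hivecmd check <id>`). Here it just replays a canned sequence
# of QDS-style statuses, one per call.
def make_fetch_status(sequence):
    it = iter(sequence)
    return lambda: next(it)

def wait_for_command(fetch_status, poll_interval=0.01, max_polls=100):
    """Poll until the command reaches a terminal status and return it."""
    terminal = {"done", "error", "cancelled"}  # assumed terminal statuses
    for _ in range(max_polls):
        status = fetch_status()
        if status in terminal:
            return status
        time.sleep(poll_interval)
    raise TimeoutError("command did not finish within max_polls polls")

fetch = make_fetch_status(["waiting", "running", "running", "done"])
print(wait_for_command(fetch))  # -> done
```

In practice you would submit once, remember the command id, and poll with a much longer interval than the one used in this toy run.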


SDK API

An example Python application needs to do the following:

  1. Set the api_token:

    from qds_sdk.qubole import Qubole
    Qubole.configure(api_token='xxyyzz')
  2. Use the Command classes defined in qds_sdk.commands to execute commands. For example, to run a Hive command:

    from qds_sdk.commands import *
    hc = HiveCommand.create(query='show tables')
    print "Id: %s, Status: %s" % (str(hc.id), hc.status)
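The CLI reads QDS_API_TOKEN as shown in step 2 of the CLI examples; a Python application can mirror that lookup before calling Qubole.configure. A minimal sketch, where `resolve_api_token` is a hypothetical helper and not part of qds-sdk:

```python
import os

def resolve_api_token(explicit_token=None):
    """Prefer an explicitly passed token, then the QDS_API_TOKEN env var."""
    token = explicit_token or os.environ.get("QDS_API_TOKEN")
    if not token:
        raise ValueError("no API token: pass one or export QDS_API_TOKEN")
    return token

os.environ["QDS_API_TOKEN"] = "xxyyzz"
print(resolve_api_token())          # -> xxyyzz
print(resolve_api_token("aabbcc"))  # explicit token wins -> aabbcc
```

The resolved token is what you would then pass as Qubole.configure(api_token=...).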

The example/ directory contains a Hadoop Streaming example.
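The streaming command in the CLI section expects a mapper and a reducer script; a generic word-count pair of that shape looks like the sketch below. This is illustrative only, not the actual scripts in the paid-qubole bucket.

```python
from itertools import groupby

def mapper(lines):
    """Map side: emit one 'word<TAB>1' record per word."""
    for line in lines:
        for word in line.split():
            yield "%s\t1" % word

def reducer(records):
    """Reduce side: sum counts per word (records arrive sorted by key)."""
    parsed = (record.split("\t") for record in records)
    for word, group in groupby(parsed, key=lambda kv: kv[0]):
        yield "%s\t%d" % (word, sum(int(count) for _, count in group))

# Simulate the shuffle/sort phase Hadoop performs between map and reduce.
mapped = sorted(mapper(["the quick brown fox", "the lazy dog"]))
print(list(reducer(mapped)))
```

In a real streaming job, each script would read sys.stdin and write to stdout; the functional form here just makes the map and reduce halves easy to exercise locally.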

File                        Type    Py Version  Uploaded on  Size
qds_sdk-1.2.0.tar.gz (md5)  Source              2014-06-25   24KB