
asynctools 0.1.2-1

Async tools for Python.

Table of Contents

  • Threading
    • Async
    • Parallel
    • Pool


Threading is the simplest approach, but because of the GIL it’s useless for computation. Use it only to parallelize access to a blocking resource, e.g. the network.
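The point can be seen with the stdlib alone. A sketch (not asynctools) using time.sleep as a stand-in for a blocking network call; all names here are illustrative:

```python
import threading
import time

def blocking_request(url, results, i):
    # Simulate a blocking network call; the GIL is released while
    # sleeping, so the other threads make progress concurrently.
    time.sleep(0.2)
    results[i] = url

urls = ['a', 'b', 'c', 'd']
results = [None] * len(urls)

start = time.monotonic()
threads = [threading.Thread(target=blocking_request, args=(u, results, i))
           for i, u in enumerate(urls)]
for t in threads:
    t.start()
for t in threads:
    t.join()
elapsed = time.monotonic() - start

# Four 0.2 s waits overlap, so the total is close to 0.2 s, not 0.8 s.
```

With a CPU-bound function instead of `time.sleep`, the same code would show no speedup, which is why the threading tools below are aimed at blocking I/O.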


Async

Source: asynctools/threading/

Decorator for functions that should be run in a separate thread. When the function is called, it returns a threading.Event, which is set once the function has finished.

from asynctools.threading import Async

@Async
def request(url):
    # ... do request
    pass

request('')  # Async request
request('').wait()  # wait for it to complete

If you want to wait for multiple threads to complete, see next chapters.
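The behavior described above (run the call in a separate thread, hand back a threading.Event) can be sketched with the stdlib; async_call below is a hypothetical stand-in, not asynctools’ actual implementation:

```python
import functools
import threading

def async_call(fn):
    """Hypothetical stand-in for asynctools' Async: run `fn` in a
    separate thread and return an Event set when it finishes."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        done = threading.Event()
        def run():
            try:
                fn(*args, **kwargs)
            finally:
                done.set()  # signal completion, even on error
        threading.Thread(target=run).start()
        return done
    return wrapper

@async_call
def request(url):
    pass  # ... do request

request('x')         # fire and forget
request('x').wait()  # block until the call completes
```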


Parallel

Source: asynctools/threading/

Execute functions in parallel and collect the results. Each function is executed in its own thread; all threads exit as soon as the work is done.


  • __call__(*args, **kwargs): Add a job: call the Parallel object, and the worker function is invoked with the same arguments in a new thread
  • map(jobs): Convenience method to call the worker once for every argument
  • first(timeout=None): Wait for a single result to become available, with an optional timeout in seconds. The result is returned as soon as it’s ready. If all threads fail with an error, None is returned.
  • join(): Wait for all tasks to be finished, and return two lists:
    • A list of results
    • A list of exceptions


from asynctools.threading import Parallel

def request(url):
    # ... do request
    return data

# Execute
pll = Parallel(request)
for url in links:
    pll(url)  # Starts a new thread

# Wait for the results
results, errors = pll.join()

Since the request method takes just one argument, this can be chained:

results, errors = Parallel(request).map(links).join()
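Of the methods above, first(timeout=None) is not demonstrated. A comparable "first result wins" pattern with the stdlib’s concurrent.futures (a sketch, not asynctools itself):

```python
import concurrent.futures
import time

def request(url):
    # Simulated requests with different latencies.
    time.sleep({'slow': 0.3, 'fast': 0.05}[url])
    return url

with concurrent.futures.ThreadPoolExecutor() as ex:
    futures = [ex.submit(request, u) for u in ('slow', 'fast')]
    # Like Parallel.first(): block until one result is ready,
    # with an optional timeout in seconds.
    done, _ = concurrent.futures.wait(
        futures, timeout=1.0,
        return_when=concurrent.futures.FIRST_COMPLETED)
    first_result = next(iter(done)).result()

# The quicker job wins: first_result == 'fast'.
```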


Pool

Source: asynctools/threading/

Create a pool of threads and execute work in it. Useful when you want to launch a limited number of long-living threads.

The methods are the same as in `Parallel <#parallel>`__, with some additions:

  • __call__(*args, **kwargs)
  • map(jobs)
  • first(timeout=None)
  • close(): Terminate all threads. The pool is no longer usable once closed.
  • __enter__(), __exit__(): context-manager support, so the pool can be used with the with statement


from asynctools.threading import Pool

def request(url):
    # ... do long request
    return data

# Make pool
pool = Pool(request, 5)

# Assign some jobs
for url in links:
    pool(url)  # Runs in the pool

# Wait for the results
results, errors = pool.join()
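The close() and context-manager behavior mirrors the stdlib’s ThreadPoolExecutor. A rough equivalent of the pool pattern above (not asynctools’ Pool):

```python
import concurrent.futures

def request(url):
    # ... do long request
    return len(url)

links = ['aa', 'bbb']

# Like `with Pool(request, 5) as pool:` -- the pool of 5 worker
# threads is shut down automatically when the block exits.
with concurrent.futures.ThreadPoolExecutor(max_workers=5) as pool:
    sizes = list(pool.map(request, links))

# sizes == [2, 3]
```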
File                                     Type           Py Version  Uploaded on  Size
asynctools-0.1.2-1.linux-x86_64.tar.gz   "dumb" binary  any         2014-07-18   6KB
  (md5; built for Linux-3.13.0-30-generic-x86_64-with-glibc2.4)
asynctools-0.1.2-1.tar.gz (md5)          Source                     2014-07-18   6KB