
AsyncTools

Async Tools for Python.

Table of Contents

  • Threading
    • Async
    • Parallel
    • Pool

Threading

Threading is the simplest approach, but because of the GIL it is useless for computation. Use it only when you want to parallelize access to a blocking resource, e.g. the network.

Async

Source: asynctools/threading/Async.py

Decorator for functions that should be run in a separate thread. When the function is called, it returns a `threading.Event <https://docs.python.org/2/library/threading.html#event-objects>`__.

from asynctools.threading import Async

@Async
def request(url):
    # ... do request
    pass

request('http://example.com')  # Async request
request('http://example.com').wait()  # wait for it to complete

If you want to wait for multiple threads to complete, see next chapters.
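
For a quick one-off, the threading.Event objects returned by the decorated function can also be collected and waited on directly; a minimal sketch:

from asynctools.threading import Async

@Async
def request(url):
    # ... do request
    pass

# Each call starts a thread and returns a threading.Event
events = [request(url) for url in ('http://example.com', 'http://example.org')]

# Wait for every call to complete
for event in events:
    event.wait()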

Parallel

Source: asynctools/threading/Parallel.py

Execute functions in parallel and collect results. Each job is executed in its own thread, and every thread exits as soon as its job is done.

Methods:

  • __call__(*args, **kwargs): Add a job. Calling the Parallel object starts a new thread that invokes the worker function with the same arguments.
  • map(jobs): Convenience method that calls the worker once for every item in jobs.
  • first(timeout=None): Wait for a single result to become available, with an optional timeout in seconds. The result is returned as soon as it is ready. If all jobs fail with an error, None is returned.
  • join(): Wait for all jobs to finish, and return two lists:
    • A list of results
    • A list of exceptions

Example:

from asynctools.threading import Parallel

def request(url):
    # ... do request
    return data

# Execute
pll = Parallel(request)
for url in links:
    pll(url)  # Starts a new thread


# Wait for the results
results, errors = pll.join()

Since the request method takes just one argument, this can be chained:

results, errors = Parallel(request).map(links).join()
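
When only the fastest result matters, first() can be used instead of join(); a minimal sketch reusing the same request worker (the 10-second timeout is just an illustrative value):

# Race all URLs and take whichever response arrives first
pll = Parallel(request)
for url in links:
    pll(url)

fastest = pll.first(timeout=10)  # first available result; None if every job failed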

Pool

Source: asynctools/threading/Pool.py

Create a pool of threads and execute work in it. Useful when you do want a limited number of long-living threads (unlike Parallel, which starts a new thread for every job).

The methods are the same as in `Parallel <#parallel>`__, with some additions:

  • __call__(*args, **kwargs)
  • map(jobs)
  • first(timeout=None)
  • close(): Terminate all threads. The pool is no longer usable once closed.
  • __enter__, __exit__: context-manager support, so the pool can be used with the with statement (see the sketch after the example below)

Example:

from asynctools.threading import Pool

def request(url):
    # ... do long request
    return data

# Make pool
pool = Pool(request, 5)

# Assign some jobs
for url in links:
    pool(url)  # Runs in the pool

# Wait for the results
results, errors = pool.join()
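
Since the pool implements __enter__ and __exit__, it can also be used as a context manager; a minimal sketch, assuming __enter__ returns the pool itself and __exit__ closes it:

with Pool(request, 5) as pool:
    for url in links:
        pool(url)
    results, errors = pool.join()
# all threads are terminated when the with block exits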
 