shelljob 0.3.5

Run multiple subprocesses asynchronously/in parallel with streamed output/non-blocking reading. Also various tools to replace shell scripts.


shelljob

A simple way to manage several parallel subprocesses. This provides for asynchronous processes and non-blocking reading of their output.

Parallel Subprocesses

Using the Job system is the quickest way to run processes and log their output (by default to files named '/tmp/job_ID.log').

from shelljob import job

jm = job.FileMonitor()
jm.run([
    [ 'ls', '-alR', '/usr/local' ],
    'my_prog',
    'build output input',
])

A list is passed directly to subprocess.Popen; a string is first parsed with shlex.split.
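For example, the string form shown above ends up as the same argument list you would write by hand; a quick check with the standard library (independent of shelljob):

import shlex

# the string command is split into an argument list before being
# handed to subprocess.Popen
print( shlex.split( 'build output input' ) )   # ['build', 'output', 'input']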

The lower-level Group class provides a simple container for more manual job management.

import sys

from shelljob import proc

g = proc.Group()
p1 = g.run( [ 'ls', '-al', '/usr/local' ] )
p2 = g.run( [ 'program', 'arg1', 'arg2' ] )

# poll until every process in the group has finished
while g.is_pending():
    lines = g.readlines()
    for p, line in lines:
        # each entry is a (process, line) pair; prefix the output with the PID
        sys.stdout.write( "{}:{}".format( p.pid, line ) )
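The same polling loop can route output anywhere. Below is a minimal sketch, using only the Group calls shown above, that writes one log file per process (the '/tmp/group_PID.log' naming is just for illustration):

from shelljob import proc

g = proc.Group()
procs = [
    g.run( [ 'ls', '-al', '/usr/local' ] ),
    g.run( [ 'ls', '-al', '/tmp' ] ),
]

# one open log file per process, keyed by PID
logs = { p.pid: open( '/tmp/group_{}.log'.format( p.pid ), 'w' ) for p in procs }

while g.is_pending():
    for p, line in g.readlines():
        # lines may be bytes or text depending on the Python/shelljob version
        text = line if isinstance( line, str ) else line.decode( 'utf-8', 'replace' )
        logs[ p.pid ].write( text )

for f in logs.values():
    f.close()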

Simple Subprocess Calls

A simplified call function allows timeouts on subprocesses and easy access to their output.

from shelljob import proc

# capture the output
output = proc.call( 'ls /tmp' )
# this raises a proc.Timeout exception
proc.call( 'sleep 10', timeout = 0.1 )
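Since a timeout surfaces as an exception, the call can be guarded in the usual way; a minimal sketch using only the proc.call and proc.Timeout names shown above:

from shelljob import proc

try:
    output = proc.call( 'sleep 10', timeout = 0.1 )
except proc.Timeout:
    # the command did not finish within the 0.1 second limit
    output = None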

Find

The 'find' function is a multi-faceted approach to generating file listings.

from shelljob import fs

files = fs.find( '/usr/local', name_regex = '.*\\.so' )
print( "\n".join(files) )
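Since find returns a plain list of path strings, it combines directly with the standard library; a small sketch that sorts the matches by size:

import os

from shelljob import fs

# find shared libraries, skip broken symlinks, and report the ten largest
files = [ f for f in fs.find( '/usr/local', name_regex = '.*\\.so' ) if os.path.isfile( f ) ]
for path in sorted( files, key = os.path.getsize, reverse = True )[:10]:
    print( '{:>10} {}'.format( os.path.getsize( path ), path ) )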

Refer to the API docs for all parameters. Just let me know if there is some additional option you need.

Issues

You can use my Launchpad project to submit issues.

 