Eek, a [web] spider.

Eek is a web crawler that outputs metadata about a website in CSV format.

Installation

$ pip install eek

Usage

usage: eek [-h] [--graph] [--delay SECONDS] [--grep PATTERN] [-i] URL

eek recursively crawls a website, outputting metadata about each page in CSV format.

positional arguments:
  URL                The base URL to start the crawl

optional arguments:
  -h, --help         show this help message and exit
  --graph            output a graphviz digraph of links instead of CSV
                     metadata
  --delay SECONDS    Time, in seconds, to wait in between fetches. Defaults to
                     0.
  --grep PATTERN     Print urls containing PATTERN (a python regular
                     expression).
  -i, --ignore-case  Ignore case. Only valid with --grep

Example:

eek http://example.com/

To save output to a file, use redirection:

eek http://example.com/ > ~/some_file.csv

To slow down crawling, use --delay=[seconds]
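
For example, to wait two seconds between fetches (the delay value here is only illustrative):

eek --delay=2 http://example.com/

Because --graph emits a Graphviz digraph on standard output, it can be piped to Graphviz's dot tool (assuming Graphviz is installed) to render the site's link structure:

eek --graph http://example.com/ | dot -Tpng -o links.png

To print only the URLs matching a Python regular expression, ignoring case (the pattern here is just an example):

eek --grep 'contact' -i http://example.com/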

