icrawler 0.6.10
pip install icrawler
Released:
A multi-threaded crawler framework with many built-in image crawlers.
Project description
Introduction
Documentation: http://icrawler.readthedocs.io/
Try it with pip install icrawler or conda install -c hellock icrawler.
This package is a mini framework for web crawlers. Thanks to its modular design, it is easy to use and extend. It supports media data such as images and videos very well, and can also be applied to texts and other types of files. Scrapy is heavy and powerful, while icrawler is tiny and flexible.
With this package, you can write a multi-threaded crawler easily by focusing on the content you want to crawl, without worrying about troublesome details such as exception handling, thread scheduling and communication.
It also provides built-in crawlers for popular image sites like Flickr and search engines such as Google, Bing and Baidu. (Thanks to all the contributors; pull requests are always welcome!)
Requirements
Python 3.5+ (recommended).
Examples
Using built-in crawlers is very simple. A minimal example is shown as follows.
from icrawler.builtin import GoogleImageCrawler
google_crawler = GoogleImageCrawler(storage={'root_dir': 'your_image_dir'})
google_crawler.crawl(keyword='cat', max_num=100)
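The other built-in search-engine crawlers follow the same pattern. A minimal sketch, assuming BingImageCrawler and BaiduImageCrawler accept the same storage and crawl() arguments as the Google crawler above; the output directories and keyword are placeholders.
from icrawler.builtin import BaiduImageCrawler, BingImageCrawler

# Bing and Baidu crawlers are used the same way as GoogleImageCrawler;
# 'your_image_dir/bing' and 'your_image_dir/baidu' are placeholder paths.
bing_crawler = BingImageCrawler(storage={'root_dir': 'your_image_dir/bing'})
bing_crawler.crawl(keyword='cat', max_num=100)

baidu_crawler = BaiduImageCrawler(storage={'root_dir': 'your_image_dir/baidu'})
baidu_crawler.crawl(keyword='cat', max_num=100)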
You can also configure the number of threads and apply advanced search options. (Note: compatible with version 0.6.0 and later.)
from icrawler.builtin import GoogleImageCrawler
google_crawler = GoogleImageCrawler(
    feeder_threads=1,
    parser_threads=2,
    downloader_threads=4,
    storage={'root_dir': 'your_image_dir'})
filters = dict(
    size='large',
    color='orange',
    license='commercial,modify',
    date=((2017, 1, 1), (2017, 11, 30)))
google_crawler.crawl(keyword='cat', filters=filters, max_num=1000, file_idx_offset=0)
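The file_idx_offset argument shifts the numbering of saved files, which helps when several crawls share one output directory. A hedged sketch, assuming file_idx_offset simply sets the starting index of the saved file names; the keywords and counts are placeholders.
from icrawler.builtin import GoogleImageCrawler

# Assumption: offsetting the file index keeps downloads for different
# keywords in one directory without overwriting earlier files.
offset = 0
for keyword in ['cat', 'dog']:
    crawler = GoogleImageCrawler(storage={'root_dir': 'your_image_dir'})
    crawler.crawl(keyword=keyword, max_num=100, file_idx_offset=offset)
    offset += 100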
For more advanced usage of the built-in crawlers, please refer to the documentation.
Writing your own crawlers with this framework is also convenient; see the tutorials.
Architecture
A crawler consists of three main components (Feeder, Parser and Downloader), which are connected to each other by FIFO queues. The workflow is outlined below.
- url_queue stores the URLs of pages which may contain images.
- task_queue stores the image URL along with any metadata you like; each element in the queue is a dictionary and must contain the field img_url.
- Feeder puts page URLs into url_queue.
- Parser requests and parses the page, then extracts the image URLs and puts them into task_queue.
- Downloader gets tasks from task_queue and requests the images, then saves them to the given path.
Feeder, parser and downloader are all thread pools, so you can specify the number of threads they use.
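The same workflow can be sketched with plain Python threads and FIFO queues. This is only an illustration of the pipeline described above, not icrawler's actual implementation; the URLs, the single thread per stage and the img_url value are placeholders.
import queue
import threading

url_queue = queue.Queue()   # page URLs that may contain images
task_queue = queue.Queue()  # dicts with the image URL plus optional metadata

def feeder():
    # Feeder puts page URLs into url_queue
    for page_url in ['http://example.com/page/1', 'http://example.com/page/2']:
        url_queue.put(page_url)
    url_queue.put(None)  # tell the parser that feeding is finished

def parser():
    # Parser would request and parse each page, then extract image URLs
    while True:
        page_url = url_queue.get()
        if page_url is None:
            task_queue.put(None)  # propagate the shutdown signal
            break
        task_queue.put({'img_url': page_url + '/image.jpg'})  # placeholder extraction

def downloader():
    # Downloader takes tasks from task_queue and would fetch and save the images
    while True:
        task = task_queue.get()
        if task is None:
            break
        print('would download', task['img_url'])

threads = [threading.Thread(target=fn) for fn in (feeder, parser, downloader)]
for t in threads:
    t.start()
for t in threads:
    t.join()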
Download files
File details
Details for the file icrawler-0.6.10.tar.gz.
File metadata
- Download URL: icrawler-0.6.10.tar.gz
- Upload date:
- Size: 40.4 kB
- Tags: Source
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.0 CPython/3.10.14
File hashes
Algorithm | Hash digest
---|---
SHA256 | 45d2f47ab5f022cdfe73395175453eac2e8e8822659f6147ed3fb82146715727
MD5 | 7622ebc41a065e7fe0697048c2f03991
BLAKE2b-256 | d5443b1b91ec67f50000363d95871f1fc24a84c39c221c060b21db2a83f92fb3
File details
Details for the file icrawler-0.6.10-py3-none-any.whl.
File metadata
- Download URL: icrawler-0.6.10-py3-none-any.whl
- Upload date:
- Size: 36.2 kB
- Tags: Python 3
- Uploaded using Trusted Publishing? No
- Uploaded via: twine/5.1.0 CPython/3.10.14
File hashes
Algorithm | Hash digest
---|---
SHA256 | 159883cb06dea3c6b665b35045dcbea9922e6532d0b3d7eaee3029a2c3864940
MD5 | 53b25112934652230c2fc53372075fc3
BLAKE2b-256 | c1141d68f9d2b01955f4c4c63d378e0a331497055b4b96ec1d3a175222411544