
Use a random User-Agent provided by fake-useragent for every request


scrapy-fake-useragent

Random User-Agent middleware for Scrapy, based on fake-useragent. It picks User-Agent strings according to usage statistics from a real-world database.

Installation

The simplest way is to install it via pip:

pip install scrapy-fake-useragent

Configuration

Turn off Scrapy's built-in UserAgentMiddleware and RetryMiddleware, and enable RandomUserAgentMiddleware and RetryUserAgentMiddleware instead.

In Scrapy >=1.0:

DOWNLOADER_MIDDLEWARES = {
    'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware': None,
    'scrapy.downloadermiddlewares.retry.RetryMiddleware': None,
    'scrapy_fake_useragent.middleware.RandomUserAgentMiddleware': 400,
    'scrapy_fake_useragent.middleware.RetryUserAgentMiddleware': 401,
}

In Scrapy <1.0:

DOWNLOADER_MIDDLEWARES = {
    'scrapy.contrib.downloadermiddleware.useragent.UserAgentMiddleware': None,
    'scrapy.contrib.downloadermiddleware.retry.RetryMiddleware': None,
    'scrapy_fake_useragent.middleware.RandomUserAgentMiddleware': 400,
    'scrapy_fake_useragent.middleware.RetryUserAgentMiddleware': 401,
}

Configuring User-Agent type

There’s a configuration parameter RANDOM_UA_TYPE, defaulting to random, which is passed verbatim to fake-useragent. You can therefore set it to, say, firefox to mimic only Firefox browsers. Most useful, though, are the desktop and mobile values, which send desktop or mobile User-Agent strings respectively.
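For example, restricting the spider to mobile User-Agent strings is a one-line settings change (the value is passed straight through to fake-useragent):

```python
# settings.py -- restrict the randomized User-Agents to mobile browsers
RANDOM_UA_TYPE = 'mobile'

# or pin to a single browser family instead, e.g.:
# RANDOM_UA_TYPE = 'firefox'
```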

Usage with scrapy-proxies

To use this middleware together with a random-proxy middleware such as scrapy-proxies, you need to:

  1. set RANDOM_UA_PER_PROXY to True, so that the User-Agent is switched per proxy;

  2. set the priority of RandomUserAgentMiddleware to be greater than that of scrapy-proxies, so that the proxy is assigned before the User-Agent is handled.
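Putting both points together, the relevant settings might be sketched as follows (assuming scrapy-proxies' documented RandomProxy middleware path and its usual priority of 100):

```python
# settings.py -- combine random proxies with per-proxy User-Agents
RANDOM_UA_PER_PROXY = True  # switch the UA whenever the proxy changes

DOWNLOADER_MIDDLEWARES = {
    'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware': None,
    'scrapy.downloadermiddlewares.retry.RetryMiddleware': None,
    # scrapy-proxies runs first (lower priority number) ...
    'scrapy_proxies.RandomProxy': 100,
    # ... so RandomUserAgentMiddleware, with a greater priority number,
    # sees the request after the proxy has already been assigned.
    'scrapy_fake_useragent.middleware.RandomUserAgentMiddleware': 400,
    'scrapy_fake_useragent.middleware.RetryUserAgentMiddleware': 401,
}
```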

Configuring Fake-UserAgent fallback

There’s a configuration parameter FAKEUSERAGENT_FALLBACK, defaulting to None. You can set it to a string value, for example Mozilla or Your favorite browser. With a fallback configured, the middleware uses that string instead of raising an exception whenever fake-useragent fails to retrieve a random UA string.
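For instance, a fallback could be configured like this (the UA string shown is just an illustrative placeholder, not a value mandated by the library):

```python
# settings.py -- UA string used when fake-useragent cannot fetch one
FAKEUSERAGENT_FALLBACK = 'Mozilla/5.0 (compatible; mycrawler/1.0)'
```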
