
# DMArchiver
A tool to archive **all** the direct messages from your private conversations on Twitter.

## Introduction
Have you ever needed to retrieve old information from a chat with your friends on Twitter? Or maybe you would just like to back up all these cheerful moments and keep them safe.

I made this tool to retrieve all the messages from my private conversations and transform them into an _IRC-like_ log for archiving.

**Output sample:**
```
[2016-09-07 10:35:55] <Michael> [Media-image] https://ton.twitter.com/1.1/ton/data/dm/773125478562429059/773401254876366208/mfeDmXXj.jpg I am so a Dexter fan...
[2016-09-07 10:37:12] <Kathy> He is so sexy. [Flushed face] I love him. [Heavy red heart]
[2016-09-07 10:38:10] <Steve> You guys are ridiculous! [Face with tears of joy]
```

Emoji are currently kept with their description to prevent encoding issues.
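One benefit of this plain-text layout is that it is easy to post-process. A minimal parsing sketch, assuming the `[timestamp] <nick> message` format shown in the sample above (the function name is hypothetical, not part of the tool):

```python
import re

# Matches one line of the IRC-like log: "[timestamp] <nick> message".
LINE_RE = re.compile(r'^\[(?P<ts>[^\]]+)\] <(?P<nick>[^>]+)> (?P<text>.*)$')

def parse_line(line):
    """Split an archived log line into (timestamp, nick, text)."""
    match = LINE_RE.match(line)
    if match is None:
        return None
    return match.group('ts'), match.group('nick'), match.group('text')
```

For example, the last sample line parses into `('2016-09-07 10:38:10', 'Steve', 'You guys are ridiculous! [Face with tears of joy]')`, with the emoji description kept as part of the message text.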

This tool can also **download all the uploaded images** in their original resolution and, as a bonus, retrieve the **GIFs** used in your conversations as MP4 files (the format Twitter uses to optimize them and save space).

You may have found suggestions to use Twitter's archive feature for the same purpose, but Direct Messages are not included in the generated archive.

The script does not use the Twitter API because of its very restrictive limitations regarding Direct Messages: it currently allows retrieving only the latest 200 messages of a private conversation.

Because older messages can still be retrieved from a conversation by scrolling up, this script simply simulates that behavior to fetch the messages automatically.
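A minimal sketch of how such a scroll-up request might be built, based on the conversation URL format shown later in this README (the function name and the stopping logic are assumptions; authentication and response parsing are omitted):

```python
BASE_URL = 'https://twitter.com/messages/with/conversation'

def build_page_url(conversation_id, max_entry_id=None):
    """Build the URL for one page of a conversation, mirroring the
    request visible in the browser's Network panel."""
    url = '{0}?id={1}'.format(BASE_URL, conversation_id)
    if max_entry_id is not None:
        # Requesting entries older than this ID simulates scrolling up.
        url = '{0}&max_entry_id={1}'.format(url, max_entry_id)
    return url

# Conceptually, each page of results supplies the next, older
# max_entry_id, and the crawl stops when a page comes back empty.
```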

**Disclaimer:**
Using this tool is no different from browsing the Twitter website yourself, so there is nothing illegal about using it to retrieve your own data. However, depending on the length of your conversations, it may send many requests to the site, which could look suspicious to Twitter. No issues have been reported so far, but use it at your own risk.

## Installation & Quick start
### Ubuntu
Python 3 should already be installed.

```
$ pip install dmarchiver
$ dmarchiver
```
### Windows
Download a Windows build from the [project releases](https://github.com/Mincka/DMArchiver/releases).

Then run the tool in a Command Prompt.
```
> C:\Temp\DMArchiver.exe
```

### macOS
You may need to install Python 3.

```
$ brew install python3
$ pip3 install dmarchiver
$ dmarchiver
```

## Upgrade
```
$ pip install dmarchiver --upgrade
```

## Usage

### Command line tool
```
$ dmarchiver [-h] [-id CONVERSATION_ID] [-di] [-dg]

$ dmarchiver --help
usage: cmdline.py [-h] [-id CONVERSATION_ID] [-di] [-dg]

optional arguments:
  -h, --help            show this help message and exit
  -id CONVERSATION_ID, --conversation_id CONVERSATION_ID
                        Conversation ID
  -di, --download-images
                        Download images
  -dg, --download-gifs  Download GIFs (as MP4)
```

### Examples

#### Archive all conversations with images and GIFs:
`$ dmarchiver -di -dg`

The script outputs one file per conversation, e.g. `645754097571131337.txt`, with the conversation formatted in an _IRC-like_ style.

The images and GIF files can be found in the `645754097571131337/images` and `645754097571131337/mp4` folders, respectively.

#### Archive a specific conversation without images and GIFs:
To retrieve only one conversation with the ID `645754097571131337`:

`$ dmarchiver -id "645754097571131337"`

The script output will be the `645754097571131337.txt` file with the conversation formatted in an _IRC-like_ style.

#### How to get a `conversation_id`?

The `conversation-id` is the identifier of the conversation you want to back up. Here is how to find it manually:
- Open the _Network_ panel in the _Development Tools_ of your favorite browser.
- Open the desired conversation on Twitter and have a look at the requests.
- Identify a request with the following arguments:
`https://twitter.com/messages/with/conversation?id=645754097571131337&max_entry_id=78473919348771337`
- Use the `id` value as your `conversation-id`. This identifier can contain special characters such as '-'.
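Once you have copied the request URL from the Network panel, the same extraction can be scripted. A small sketch using the standard library (the function name is hypothetical):

```python
from urllib.parse import urlparse, parse_qs

def extract_conversation_id(request_url):
    """Pull the `id` query parameter out of a copied conversation URL."""
    params = parse_qs(urlparse(request_url).query)
    # parse_qs maps each query parameter to a list of values.
    return params['id'][0]
```

For example, passing the URL shown above returns `'645754097571131337'`.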

### Module import
```python
>>> from dmarchiver.core import Crawler
>>> crawler = Crawler()
>>> crawler.authenticate('username', 'password')
>>> crawler.crawl('conversation_id')
```

## Development setup
```shell
$ git clone https://github.com/Mincka/DMArchiver.git
$ cd dmarchiver
$ virtualenv venv
$ source venv/bin/activate # "venv/Scripts/Activate.bat" on Windows
$ pip install -r requirements.txt
$ python setup.py install
```

### Windows binary build
You can build it with `pyinstaller`.

```
> pip install pyinstaller
> pyinstaller --onefile dmarchiver\cmdline.py -n dmarchiver.exe
> cd dist
> dmarchiver.exe
```

## Troubleshooting
You may encounter build issues with the `lxml` library on Windows. The simplest and most straightforward fix is to download a precompiled binary from [this site](http://www.lfd.uci.edu/~gohlke/pythonlibs/#lxml) and install the package locally:

`$ pip install lxml-3.6.4-cp35-cp35m-win_amd64.whl`

## License

Copyright (C) 2016 Julien EHRHART

This program is free software: you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation, either version 3 of the License, or
(at your option) any later version.

This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.

You should have received a copy of the GNU General Public License
along with this program. If not, see <http://www.gnu.org/licenses/>.
