Continuum

A clean and simple library for Continual Learning in PyTorch.


A library of PyTorch data loaders for Continual Learning, also known as Lifelong Learning, Incremental Learning, etc.

Read the documentation.

Example:

Install from PyPI:

pip3 install continuum

And run!

from torch.utils.data import DataLoader

from continuum import ClassIncremental, split_train_val
from continuum.datasets import MNIST

clloader = ClassIncremental(
    MNIST("my/data/path", download=True),
    increment=1,
    initial_increment=5,
    train=True  # set train=False to get the corresponding test loader
)

print(f"Number of classes: {clloader.nb_classes}.")
print(f"Number of tasks: {clloader.nb_tasks}.")

for task_id, train_dataset in enumerate(clloader):
    train_dataset, val_dataset = split_train_val(train_dataset, val_split=0.1)
    train_loader = DataLoader(train_dataset)
    val_loader = DataLoader(val_dataset)

    for x, y, t in train_loader:
        # Do your cool stuff here
        pass

Supported Scenarios

| Name | Acronym | Supported |
|:-----|:--------|:---------:|
| New Instances | NI | :white_check_mark: |
| New Classes | NC | :white_check_mark: |
| New Instances & Classes | NIC | :white_check_mark: |

Supported Datasets:

Note that the task sizes are fully customizable.

| Name | Nb classes | Image Size | Automatic Download | Type |
|:-----|:----------:|:----------:|:------------------:|:----:|
| MNIST | 10 | 28x28x1 | :white_check_mark: | :eyes: |
| Fashion MNIST | 10 | 28x28x1 | :white_check_mark: | :eyes: |
| KMNIST | 10 | 28x28x1 | :white_check_mark: | :eyes: |
| EMNIST | 10 | 28x28x1 | :white_check_mark: | :eyes: |
| QMNIST | 10 | 28x28x1 | :white_check_mark: | :eyes: |
| MNIST Fellowship | 30 | 28x28x1 | :white_check_mark: | :eyes: |
| CIFAR10 | 10 | 32x32x3 | :white_check_mark: | :eyes: |
| CIFAR100 | 100 | 32x32x3 | :white_check_mark: | :eyes: |
| CIFAR Fellowship | 110 | 32x32x3 | :white_check_mark: | :eyes: |
| ImageNet100 | 100 | 224x224x3 | :x: | :eyes: |
| ImageNet1000 | 1000 | 224x224x3 | :x: | :eyes: |
| Permuted MNIST | 10 | 28x28x1 | :white_check_mark: | :eyes: |
| Rotated MNIST | 10 | 28x28x1 | :white_check_mark: | :eyes: |
| CORe50 | 50 | 224x224x3 | :white_check_mark: | :eyes: |
| CORe50-v2-79 | 50 | 224x224x3 | :white_check_mark: | :eyes: |
| CORe50-v2-196 | 50 | 224x224x3 | :white_check_mark: | :eyes: |
| CORe50-v2-391 | 50 | 224x224x3 | :white_check_mark: | :eyes: |
| MultiNLI | 5 | | :white_check_mark: | :book: |

Furthermore, some "Meta"-datasets are available:

InMemoryDataset, for in-memory numpy arrays:

x_train, y_train = gen_numpy_array()
x_test, y_test = gen_numpy_array()

clloader = CLLoader(
    InMemoryDataset(x_train, y_train, x_test, y_test),
    increment=10,
)

PyTorchDataset, for any dataset defined in torchvision:

clloader = CLLoader(
    PyTorchDataset("/my/data/path", dataset_type=torchvision.datasets.CIFAR10),
    increment=10,
)

ImageFolderDataset, for datasets having a tree-like structure, with one folder per class:

clloader = CLLoader(
    ImageFolderDataset("/my/train/folder", "/my/test/folder"),
    increment=10,
)

Fellowship, to combine several continual datasets:

clloader = CLLoader(
    Fellowship("/my/data/path", dataset_list=[CIFAR10, CIFAR100]),
    increment=10,
)

Some datasets cannot provide an automatic download of the data for miscellaneous reasons. For example, for ImageNet, you'll need to download the data from the official page. Then load it as follows:

clloader = CLLoader(
    ImageNet1000("/my/train/folder", "/my/test/folder"),
    increment=10,
)

Some papers use a subset of ImageNet, called ImageNet100 or ImageNetSubset. It is automatically downloaded for you, but you can also provide your own.

Continual Loader

Class Incremental

The Continual Loader ClassIncremental loads the data and batches it into several tasks, each with new classes. Here are some example arguments:

from torchvision import transforms

from continuum import ClassIncremental

clloader = ClassIncremental(
    my_continual_dataset,
    increment=10,
    initial_increment=2,
    train_transformations=[transforms.RandomHorizontalFlip()],
    common_transformations=[
        transforms.ToTensor(),
        transforms.Normalize((0.4914, 0.4822, 0.4465), (0.2023, 0.1994, 0.2010))
    ],
    train=True
)

Here the first task is made of 2 classes, then all following tasks of 10 classes. You can have a more fine-grained increment by providing a list, e.g. increment=[2, 10, 5, 10].
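For instance, a minimal sketch reusing the my_continual_dataset placeholder from above (the exact splits are only illustrative):

clloader = ClassIncremental(
    my_continual_dataset,
    increment=[2, 10, 5, 10],  # first task has 2 classes, then 10, 5, and 10
    train=True
)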

The train_transformations are applied only to the training data, while the common_transformations are applied to both the training and testing data.

If you want a clloader for the test data, you'll need to create a new instance with train=False.
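For example, a sketch mirroring the training loader defined above (clloader_test is only an illustrative name):

clloader_test = ClassIncremental(
    my_continual_dataset,
    increment=10,
    initial_increment=2,
    common_transformations=[
        transforms.ToTensor(),
        transforms.Normalize((0.4914, 0.4822, 0.4465), (0.2023, 0.1994, 0.2010))
    ],
    train=False  # load the test data of each task
)

Note that the training-only transformations are omitted here, since they should not be applied to the test data.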

Instance Incremental

Tasks can also be made of new instances. By default the sample images are randomly shuffled into the different tasks, but some datasets provide, in addition to the data x and labels y, a task id t per sample. For example MultiNLI, an NLP dataset, has 5 classes but 10 different domains. Each domain represents a new task.

from continuum import InstanceIncremental
from continuum.datasets import MultiNLI

clloader = InstanceIncremental(
    MultiNLI("/my/path/where/to/download"),
    train=True
)

New Class & Instance

The NIC setting is a special case of the NI setting. For now, only the CORe50 dataset supports this setting.

Indexing

All our continual loaders are iterable (i.e. you can loop over them with a for loop) and also indexable.

This means that clloader[2] returns the third task (indexing starts at 0). Likewise, if you want to evaluate after each task on all seen tasks, use clloader_test[:n].
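A minimal sketch of such an evaluation loop, assuming the clloader and clloader_test instances created above (the training and evaluation steps are left as placeholders):

from torch.utils.data import DataLoader

for task_id, train_dataset in enumerate(clloader):
    train_loader = DataLoader(train_dataset, batch_size=32, shuffle=True)
    # ... train your model on train_loader ...

    # Evaluate on all tasks seen so far by slicing the test loader.
    seen_test_dataset = clloader_test[:task_id + 1]
    test_loader = DataLoader(seen_test_dataset, batch_size=32)
    for x, y, t in test_loader:
        pass  # ... evaluate your model here ...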

Sample Images

MNIST: [sample images, Task 0 to Task 4]

FashionMNIST: [sample images, Task 0 to Task 4]

CIFAR10: [sample images, Task 0 to Task 4]

MNIST Fellowship (MNIST + FashionMNIST + KMNIST): [sample images, Task 0 to Task 2]

PermutedMNIST: [sample images, Task 0 to Task 4]

RotatedMNIST: [sample images, Task 0 to Task 4]

ImageNet100: [sample images, Task 0 to Task 3, ...]

Citation

If you find this library useful in your work, please consider citing it:

@misc{douillardlesort2020continuum,
  author={Douillard, Arthur and Lesort, Timothée},
  title={Continuum, Data Loaders for Continual Learning},
  howpublished={https://github.com/Continvvm/continuum},
  year={2020},
  doi={10.5281/zenodo.3759673}
}

Maintainers

This project was started as a joint effort by Arthur Douillard & Timothée Lesort.

Feel free to contribute! If you want to propose new features, please create an issue.

On PyPI

Our project is available on PyPI!

pip3 install continuum

Note that another project, a CI tool, previously used that name; it is now available as continuum_ci.
