
Implementation of Dynamic Ensemble Selection methods



DESlib

DESlib is an easy-to-use ensemble learning library focused on the implementation of state-of-the-art techniques for dynamic classifier and ensemble selection. The library is based on scikit-learn and uses the same method signatures: fit, predict, predict_proba and score. All dynamic selection techniques were implemented according to the definitions from [1].

Dynamic Selection:

Dynamic Selection (DS) refers to techniques in which the base classifiers are selected dynamically at test time, according to each new sample to be classified. Only the most competent classifier, or an ensemble of the most competent classifiers, is selected to predict the label of a specific test sample. The rationale behind these techniques is that not every classifier in the pool is an expert in classifying all unknown samples; rather, each base classifier is an expert in a different local region of the feature space.

DS is one of the most promising approaches in Multiple Classifier Systems (MCS), owing to a growing number of empirical studies reporting superior performance over static combination methods. Such techniques achieve better classification performance especially when dealing with small-sized and imbalanced datasets.

Installation:

The package can be installed using pip:

Stable version:

pip install deslib

Latest version (under development):

pip install git+https://github.com/scikit-learn-contrib/DESlib

Dependencies:

The dependency requirements are:

  • Python (>= 3.7)

  • NumPy (>= 1.17.3)

  • SciPy (>= 1.5.0)

  • Scikit-learn (>= 1.0.2)

These dependencies are automatically installed using the pip commands above.

Examples:

Here we show an example using the KNORA-E method with a random forest as the pool of classifiers (a toy dataset is generated so that the snippet is self-contained):

from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

from deslib.des.knora_e import KNORAE

# Generate a toy dataset and split it into training, DSEL and test sets
X, y = make_classification(n_samples=1000, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=42)
X_train, X_dsel, y_train, y_dsel = train_test_split(X_train, y_train, test_size=0.50, random_state=42)

# Train a pool of 10 classifiers
pool_classifiers = RandomForestClassifier(n_estimators=10)
pool_classifiers.fit(X_train, y_train)

# Initialize the DES model
knorae = KNORAE(pool_classifiers)

# Preprocess the Dynamic Selection dataset (DSEL)
knorae.fit(X_dsel, y_dsel)

# Predict new examples
knorae.predict(X_test)

The library accepts any list of classifiers compatible with scikit-learn as input, including a list containing different classifier models (heterogeneous ensembles); a sketch is shown below. More examples of how to use the API can be found in the documentation and in the Examples directory.
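As an illustration, here is a minimal sketch of a heterogeneous pool. It reuses the X_train, y_train, X_dsel, y_dsel and X_test splits from the example above and uses KNORA-U as the DS method; any other DESlib technique could be substituted.

from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

from deslib.des.knora_u import KNORAU

# Each base classifier of the heterogeneous pool is fitted on the training data
pool_classifiers = [LogisticRegression(), GaussianNB(), DecisionTreeClassifier()]
for clf in pool_classifiers:
    clf.fit(X_train, y_train)

# The DS technique receives the list of fitted classifiers as its pool
knorau = KNORAU(pool_classifiers)
knorau.fit(X_dsel, y_dsel)
knorau.predict(X_test)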

Organization:

The library is divided into four modules:

  1. deslib.des: Implementation of DES techniques (Dynamic Ensemble Selection).

  2. deslib.dcs: Implementation of DCS techniques (Dynamic Classifier Selection).

  3. deslib.static: Implementation of baseline ensemble methods.

  4. deslib.util: A collection of aggregation functions and diversity measures for ensembles of classifiers.

  • DES techniques currently available are:
    1. META-DES [7] [8] [15]

    2. K-Nearest-Oracle-Eliminate (KNORA-E) [3]

    3. K-Nearest-Oracle-Union (KNORA-U) [3]

    4. Dynamic Ensemble Selection-Performance (DES-P) [12]

    5. K-Nearest-Output Profiles (KNOP) [9]

    6. Randomized Reference Classifier (DES-RRC) [10]

    7. DES Kullback-Leibler Divergence (DES-KL) [12]

    8. DES-Exponential [21]

    9. DES-Logarithmic [11]

    10. DES-Minimum Difference [21]

    11. DES-Clustering [16]

    12. DES-KNN [16]

    13. DES Multiclass Imbalance (DES-MI) [24]

  • DCS techniques currently available are:
    1. Modified Classifier Rank (Rank) [19]

    2. Overall Local Accuracy (OLA) [4]

    3. Local Class Accuracy (LCA) [4]

    4. Modified Local Accuracy (MLA) [23]

    5. Multiple Classifier Behaviour (MCB) [5]

    6. A Priori Selection (A Priori) [6]

    7. A Posteriori Selection (A Posteriori) [6]

  • Baseline methods:
    1. Oracle [20]

    2. Single Best [2]

    3. Static Selection [2]

    4. Stacked Classifier [25]

Variations of each DES technique are also provided by the library (e.g., different versions of the META-DES framework).
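To give a rough feel for this organization, the sketch below imports one technique from each module; the module paths reflect my reading of the library's layout and should be verified against the documentation for the installed version. All DS techniques expose the same scikit-learn style interface (fit, predict, predict_proba, score).

# DES techniques
from deslib.des.knora_e import KNORAE
from deslib.des.meta_des import METADES
# DCS techniques
from deslib.dcs.ola import OLA
# Baseline (static) ensemble methods
from deslib.static.oracle import Oracle
# Aggregation functions and diversity measures
from deslib.util import diversity

# Every DS technique follows the same interface as the KNORA-E example above
ola = OLA(pool_classifiers)
ola.fit(X_dsel, y_dsel)
ola.predict(X_test)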

The following options are also available for the DS methods (a combined sketch is shown after this list):
  • For DES techniques, the combination of the selected classifiers can be done as Dynamic Selection (majority voting), Dynamic Weighting (weighted majority voting) or a Hybrid (selection + weighting).

  • For all DS techniques, Dynamic Frienemy Pruning (DFP) [13] can be used.

  • For all DS techniques, Instance Hardness (IH) can be used to classify easy samples with a KNN and hard samples using the DS technique. More details on IH and Dynamic Selection can be found in [14].
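A minimal sketch combining these options with META-DES is shown below. The parameter names (mode, DFP, with_IH, IH_rate) reflect my reading of the DESlib API; please confirm them against the documentation for your version.

from deslib.des.meta_des import METADES

# META-DES combining the selected classifiers in hybrid mode (selection + weighting),
# with Dynamic Frienemy Pruning enabled and instance hardness used to route easy
# samples to a plain KNN. Parameter names assumed from the DESlib documentation.
metades = METADES(pool_classifiers,
                  mode='hybrid',   # 'selection', 'weighting' or 'hybrid'
                  DFP=True,        # Dynamic Frienemy Pruning
                  with_IH=True,    # use instance hardness for easy samples
                  IH_rate=0.1)     # hardness threshold for the KNN fallback
metades.fit(X_dsel, y_dsel)
metades.predict(X_test)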

As an optional requirement, the fast KNN implementation from FAISS can be used to speed up the computation of the region of competence on the GPU.
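To the best of my knowledge this is exposed through the knn_classifier parameter of the DS classes when FAISS is installed; the sketch below assumes that parameter name, so verify it against the documentation of your DESlib version.

from deslib.des.knora_e import KNORAE

# Use the FAISS-based nearest-neighbor search to compute the region of competence
# (assumes the optional faiss package is installed; parameter name assumed from the docs)
knorae_faiss = KNORAE(pool_classifiers, knn_classifier='faiss')
knorae_faiss.fit(X_dsel, y_dsel)
knorae_faiss.predict(X_test)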

Citation

If you use DESlib in a scientific paper, please consider citing the following paper:

Rafael M. O. Cruz, Luiz G. Hafemann, Robert Sabourin and George D. C. Cavalcanti. DESlib: A Dynamic ensemble selection library in Python. Journal of Machine Learning Research, 21(8):1-5, 2020.

@article{JMLR:v21:18-144,
    author  = {Rafael M. O. Cruz and Luiz G. Hafemann and Robert Sabourin and George D. C. Cavalcanti},
    title   = {DESlib: A Dynamic ensemble selection library in Python},
    journal = {Journal of Machine Learning Research},
    year    = {2020},
    volume  = {21},
    number  = {8},
    pages   = {1-5},
    url     = {http://jmlr.org/papers/v21/18-144.html}
}

References:

