
Python implementations of preprocessing techniques for imbalanced data in regression


Imbalanced Learning Regression


Description

A Python implementation of sampling techniques for regression. The package provides a range of re-sampling methods that are useful for prediction problems where regression is applicable but the values of interest are rare or uncommon. It can also serve as a useful alternative to log-transforming a skewed response variable, especially when generating synthetic data is also of interest.

Features

  1. An open-source Python implementation of sampling techniques for regression, built as a variation of Nick Kunz's SMOGN package.

  2. Supports Pandas DataFrame inputs containing mixed data types.

  3. Flexible inputs to control the areas of interest within a continuous response variable and user-friendly parameters for re-sampling the data.

  4. Purely Pythonic, developed for consistency, maintainability, and future improvement, with no foreign function calls to C or Fortran as contained in the original R implementation.

Requirements

  1. Python 3
  2. NumPy
  3. Pandas
  4. Scikit-learn

Installation

## install pypi release
pip install ImbalancedLearningRegression

## install developer version
pip install git+https://github.com/paobranco/ImbalancedLearningRegression.git
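
To confirm the installation, a minimal check that assumes nothing beyond the import name used in the examples below:

## verify the package imports correctly
import ImbalancedLearningRegression as iblr
print(iblr.__name__)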

Usage

## load libraries
import pandas
import ImbalancedLearningRegression as iblr

## load data
housing = pandas.read_csv(
    ## http://jse.amstat.org/v19n3/decock.pdf
    "https://raw.githubusercontent.com/paobranco/ImbalancedLearningRegression/master/data/housing.csv"
)

## conduct Random Over-sampling
housing_ro = iblr.ro(
    data = housing, 
    y = "SalePrice"
)

## conduct Introduction of Gaussian Noise
housing_gn = iblr.gn(
    data = housing, 
    y = "SalePrice"
)
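
A quick way to see the effect of re-sampling is to compare the response distribution before and after. In the second call below, the extra arguments (samp_method, rel_thres) are assumptions based on the package's SMOGN lineage, included only as a sketch of the flexible inputs mentioned above; please verify them against the documentation.

## compare the response distribution before and after over-sampling
print(housing["SalePrice"].describe())
print(housing_ro["SalePrice"].describe())

## hedged sketch: samp_method and rel_thres are assumed SMOGN-style parameters
housing_ro_extreme = iblr.ro(
    data = housing, 
    y = "SalePrice",
    samp_method = "extreme",  ## focus over-sampling on extreme values
    rel_thres = 0.80          ## relevance threshold defining the rare region
)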

Examples

  1. Random Over-sampling
  2. Introduction of Gaussian Noise
  3. Condensed Nearest Neighbor
  4. Edited Nearest Neighbor

Examples of the other techniques can be found here.
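
As a hedged sketch of the under-sampling techniques listed above, the calls below assume that iblr.cnn and iblr.enn follow the same naming pattern and interface as iblr.ro and iblr.gn; please verify the function names in the documentation.

## conduct Condensed Nearest Neighbor (assumed function name iblr.cnn)
housing_cnn = iblr.cnn(
    data = housing, 
    y = "SalePrice"
)

## conduct Edited Nearest Neighbor (assumed function name iblr.enn)
housing_enn = iblr.enn(
    data = housing, 
    y = "SalePrice"
)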

Documentation

The documentation of the package can be found here.

License

© Paula Branco, 2022. Licensed under the General Public License v3.0 (GPLv3).

Contributions

ImbalancedLearningRegression is open to contributions for improvement and maintenance. Your help is valued in making the package better for everyone.

Reference

Branco, P., Torgo, L., Ribeiro, R. (2017). SMOGN: A Pre-Processing Approach for Imbalanced Regression. Proceedings of Machine Learning Research, 74:36-50. http://proceedings.mlr.press/v74/branco17a/branco17a.pdf

Branco, P., Torgo, L., & Ribeiro, R. P. (2019). Pre-processing approaches for imbalanced distributions in regression. Neurocomputing, 343, 76-99. https://www.sciencedirect.com/science/article/abs/pii/S0925231219301638

Chawla, N. V., Bowyer, K. W., Hall, L. O., & Kegelmeyer, W. P. (2002). SMOTE: synthetic minority over-sampling technique. Journal of artificial intelligence research, 16, 321-357. https://www.jair.org/index.php/jair/article/view/10302

Elhassan, T., & Aljurf, M. (2016). Classification of imbalance data using Tomek link (T-Link) combined with random under-sampling (RUS) as a data reduction method. Global J Technol Optim S, 1. https://www.researchgate.net/profile/Mohamed-Shoukri-2/publication/326590590_Classification_of_Imbalance_Data_using_Tomek_Link_T-Link_Combined_with_Random_Under-sampling_RUS_as_a_Data_Reduction_Method/links/5b96a6a0a6fdccfd543cbc40/Classification-of-Imbalance-Data-using-Tomek-Link-T-Link-Combined-with-Random-Under-sampling-RUS-as-a-Data-Reduction-Method.pdf

Hart, P. (1968). The condensed nearest neighbor rule (corresp.). IEEE transactions on information theory, 14(3), 515-516. https://ieeexplore.ieee.org/document/1054155

He, H., Bai, Y., Garcia, E. A., & Li, S. (2008, June). ADASYN: Adaptive synthetic sampling approach for imbalanced learning. In 2008 IEEE international joint conference on neural networks (IEEE world congress on computational intelligence) (pp. 1322-1328). IEEE. https://www.ele.uri.edu/faculty/he/PDFfiles/adasyn.pdf

Kunz, N., (2019). SMOGN. https://github.com/nickkunz/smogn

Menardi, G., & Torelli, N. (2014). Training and assessing classification rules with imbalanced data. Data mining and knowledge discovery, 28(1), 92-122. https://link.springer.com/article/10.1007/s10618-012-0295-5

Tomek, I. (1976). Two modifications of CNN. IEEE Trans. Systems, Man and Cybernetics, 6, 769-772. https://ieeexplore.ieee.org/document/4309452

Torgo, L., Ribeiro, R. P., Pfahringer, B., & Branco, P. (2013, September). SMOTE for regression. In Portuguese conference on artificial intelligence (pp. 378-389). Springer, Berlin, Heidelberg. https://link.springer.com/chapter/10.1007/978-3-642-40669-0_33

Wilson, D. L. (1972). Asymptotic properties of nearest neighbor rules using edited data. IEEE Transactions on Systems, Man, and Cybernetics, (3), 408-421. https://ieeexplore.ieee.org/abstract/document/4309137
