Python implementation of Gradient COBRA by S. Has (2023), with other aggregation and kernel methods.
Introduction
gradientcobra is the Python package implementing the Gradient COBRA method by S. Has (2023), a kernel-based consensual aggregation method for regression problems. It generalizes the COBRA method of Biau et al. (2016) to regular kernels. We have shown theoretically that the consistency inheritance property also holds in this kernel-based setting, and that the same convergence rate as classical COBRA is achieved. Moreover, a gradient descent algorithm is used to efficiently estimate the bandwidth parameter of the method. This efficiency is illustrated in several numerical experiments on simulated and real datasets.
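To give a feel for the aggregation idea (this is an illustrative NumPy sketch of the kernel-weighted consensual aggregation scheme, not the package's actual API; the function name and signature are hypothetical): the final prediction at a query point is a weighted average of the training responses, where each weight comes from a kernel, here Gaussian, applied to the distance between the base machines' predictions at the query point and at that training point.

```python
import numpy as np

def gaussian_kernel_aggregate(train_preds, y_train, query_preds, bandwidth=1.0):
    """Hypothetical sketch: aggregate base-machine predictions with
    Gaussian kernel weights, in the spirit of kernel-based consensual
    aggregation.

    train_preds : (n_train, M) predictions of M base machines on training points
    y_train     : (n_train,)   training responses
    query_preds : (n_query, M) base-machine predictions at query points
    """
    # Squared Euclidean distances in the space of base-machine predictions
    d2 = ((query_preds[:, None, :] - train_preds[None, :, :]) ** 2).sum(axis=2)
    # Gaussian kernel weights, then row-wise normalization (guard against zeros)
    w = np.exp(-d2 / (2.0 * bandwidth ** 2))
    w_sum = w.sum(axis=1, keepdims=True)
    w = np.divide(w, w_sum, out=np.full_like(w, 1.0 / len(y_train)),
                  where=w_sum > 0)
    # Kernel-weighted average of training responses
    return w @ y_train

# Toy usage: two base machines, three training points, one query point
train_preds = np.array([[1.0, 1.1], [2.0, 1.9], [3.0, 3.2]])
y_train = np.array([1.0, 2.0, 3.0])
query_preds = np.array([[2.1, 2.0]])
print(gaussian_kernel_aggregate(train_preds, y_train, query_preds, bandwidth=0.5))
```

In the package itself, the bandwidth is not fixed by hand as above but estimated by gradient descent, which is the point of the method.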
For more information, read the “Documentation and Examples” below.
Installation
In your terminal, run the following command to download and install from PyPI:
pip install gradientcobra
Citation
If you find gradientcobra helpful, please consider citing the following papers:
S. Has (2023), Gradient COBRA: A kernel-based consensual aggregation for regression.
A. Fischer and M. Mougeot (2019), Aggregation using input-output trade-off.
G. Biau, A. Fischer, B. Guedj and J. D. Malley (2016), COBRA: A combined regression strategy.
Documentation and Examples
For more information on how to use the package, read the gradientcobra documentation.
Dependencies
Python 3.9+
numpy, scipy, scikit-learn, matplotlib, pandas, seaborn, plotly
References
S. Has (2023). Gradient COBRA: A kernel-based consensual aggregation for regression. Journal of Data Science, Statistics, and Visualisation, 3(2).
A. Fischer and M. Mougeot (2019). Aggregation using input-output trade-off. Journal of Statistical Planning and Inference, 200.
G. Biau, A. Fischer, B. Guedj and J. D. Malley (2016). COBRA: A combined regression strategy. Journal of Multivariate Analysis.
M. Mojirsheibani (1999). Combining Classifiers via Discretization. Journal of the American Statistical Association.
M. J. van der Laan, E. C. Polley, and A. E. Hubbard (2007). Super Learner. Statistical Applications in Genetics and Molecular Biology, 6, Article 25.
T. Hastie, R. Tibshirani, and J. Friedman (2009). Kernel Smoothing Methods. In: The Elements of Statistical Learning. Springer Series in Statistics. Springer, New York, NY.