Commit

Change the package name

Guillaume Lemaitre committed Jun 26, 2016
1 parent 1c30fbe commit b6e015e
Showing 203 changed files with 109 additions and 105 deletions.
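The changes below are a mechanical rename of the package from `unbalanced_dataset` to `imblearn` across the whole tree. A rename of this shape can be sketched with a short script; the scratch directory and file below are illustrative only, not paths this commit actually touched:

```python
# Hypothetical sketch of a tree-wide package rename; the scratch tree is
# illustrative only, not part of this commit.
import pathlib
import tempfile

tmp = pathlib.Path(tempfile.mkdtemp())
(tmp / "example.py").write_text("from unbalanced_dataset.combine import SMOTEENN\n")

# Rewrite every file under the tree, swapping the old package name for the new one.
for path in tmp.rglob("*"):
    if path.is_file():
        path.write_text(path.read_text().replace("unbalanced_dataset", "imblearn"))

print((tmp / "example.py").read_text().strip())
# from imblearn.combine import SMOTEENN
```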
4 changes: 2 additions & 2 deletions .coveragerc
@@ -2,8 +2,8 @@

[run]
branch = True
-source = unbalanced_dataset
-include = */unbalanced_dataset/*
+source = imblearn
+include = */imblearn/*
omit =
    */setup.py

2 changes: 1 addition & 1 deletion .travis.yml
@@ -23,7 +23,7 @@ env:
  global:
    # Directory where tests are run from
    - TEST_DIR=/tmp/test_dir
-    - MODULE=unbalanced_dataset
+    - MODULE=imblearn
    - OMP_NUM_THREADS=4
    - OPENBLAS_NUM_THREADS=4
  matrix:
6 changes: 3 additions & 3 deletions Makefile
@@ -17,13 +17,13 @@ clean:
	rm -rf examples/.ipynb_checkpoints

test:
-	$(NOSETESTS) -s -v unbalanced_dataset
+	$(NOSETESTS) -s -v imblearn

# doctest:
-#	$(PYTHON) -c "import unbalanced_dataset, sys, io; sys.exit(unbalanced_dataset.doctest_verbose())"
+#	$(PYTHON) -c "import imblearn, sys, io; sys.exit(imblearn.doctest_verbose())"

coverage:
-	$(NOSETESTS) unbalanced_dataset -s -v --with-coverage --cover-package=unbalanced_dataset
+	$(NOSETESTS) imblearn -s -v --with-coverage --cover-package=imblearn

html:
	conda install -y sphinx sphinx_rtd_theme numpydoc
10 changes: 5 additions & 5 deletions README.md
@@ -1,7 +1,7 @@
-UnbalancedDataset
+imbalanced-learn
=================

-UnbalancedDataset is a python package offering a number of re-sampling techniques commonly used in datasets showing strong between-class imbalance.
+imbalanced-learn is a python package offering a number of re-sampling techniques commonly used in datasets showing strong between-class imbalance.
It is compatible with [scikit-learn](http://scikit-learn.org/stable/) and has been submitted to be part of [scikit-learn-contrib](https://github.com/scikit-learn-contrib) projects.

[![Code Health](https://landscape.io/github/glemaitre/UnbalancedDataset/master/landscape.svg?style=flat)](https://landscape.io/github/glemaitre/UnbalancedDataset/master)
@@ -19,15 +19,15 @@ Installation

### Dependencies

-UnbalancedDataset is tested to work under Python 2.7 and Python 3.5.
+imbalanced-learn is tested to work under Python 2.7 and Python 3.5.

* scipy(>=0.17.0)
* numpy(>=1.10.4)
* scikit-learn(>=0.17.1)

### Installation

-UnbalancedDataset is not currently available on the PyPi's reporitories,
+imbalanced-learn is not currently available on the PyPi's reporitories,
however you can install it via `pip`:

    pip install git+https://github.com/fmfn/UnbalancedDataset
@@ -60,7 +60,7 @@ Re-sampling techniques are divided in two categories:
3. Combining over- and under-sampling.
4. Create ensemble balanced sets.

-Bellow is a list of the methods currently implemented in this module.
+Below is a list of the methods currently implemented in this module.

* Under-sampling
    1. Random majority under-sampling with replacement
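The first method in the README's list above, random majority under-sampling with replacement, can be sketched in a few lines of plain numpy (a toy illustration only, not the package's actual implementation):

```python
import numpy as np

# Toy sketch of random majority under-sampling (not imblearn's implementation):
# keep every minority sample and a random draw (with replacement) from the
# majority class so both classes end up the same size.
rng = np.random.RandomState(0)
X = np.arange(20).reshape(10, 2)
y = np.array([0, 0, 0, 0, 0, 0, 0, 0, 1, 1])   # class 0 is the majority

minority = np.flatnonzero(y == 1)
majority = np.flatnonzero(y == 0)
kept = rng.choice(majority, size=minority.size, replace=True)

idx = np.concatenate([kept, minority])
X_res, y_res = X[idx], y[idx]
print(np.bincount(y_res))  # [2 2]
```

The resampled set is balanced at the minority class size; the actual `imblearn` samplers wrap this kind of logic behind a scikit-learn-compatible estimator API.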
32 changes: 18 additions & 14 deletions doc/api.rst
@@ -1,21 +1,21 @@
-#################
-API Documentation
-#################
+######################
+`imbalanced-learn` API
+######################

-This is the full API documentation of the `unbalanced_dataset` toolbox.
+This is the full API documentation of the `imbalanced-learn` toolbox.

.. _under_sampling_ref:

Under-sampling methods
======================

-.. automodule:: unbalanced_dataset.under_sampling
+.. automodule:: imblearn.under_sampling
    :no-members:
    :no-inherited-members:

Classes
-------
-.. currentmodule:: unbalanced_dataset
+.. currentmodule:: imblearn

.. autosummary::
    :toctree: generated/
@@ -37,13 +37,13 @@ Classes
Over-sampling methods
=====================

-.. automodule:: unbalanced_dataset.over_sampling
+.. automodule:: imblearn.over_sampling
    :no-members:
    :no-inherited-members:

Classes
-------
-.. currentmodule:: unbalanced_dataset
+.. currentmodule:: imblearn

.. autosummary::
    :toctree: generated/
@@ -57,13 +57,13 @@ Classes
Combination of over- and under-sampling methods
===============================================

-.. automodule:: unbalanced_dataset.combine
+.. automodule:: imblearn.combine
    :no-members:
    :no-inherited-members:

Classes
-------
-.. currentmodule:: unbalanced_dataset
+.. currentmodule:: imblearn

.. autosummary::
    :toctree: generated/
@@ -77,13 +77,13 @@ Classes
Ensemble methods
================

-.. automodule:: unbalanced_dataset.ensemble
+.. automodule:: imblearn.ensemble
    :no-members:
    :no-inherited-members:

Classes
-------
-.. currentmodule:: unbalanced_dataset
+.. currentmodule:: imblearn

.. autosummary::
    :toctree: generated/
@@ -97,17 +97,21 @@ Classes
Pipeline
========

-.. automodule:: unbalanced_dataset.pipeline
+.. automodule:: imblearn.pipeline
    :no-members:
    :no-inherited-members:

-.. currentmodule:: unbalanced_dataset
+.. currentmodule:: imblearn

Classes
-------
.. autosummary::
:toctree: generated/

pipeline.Pipeline

Functions
---------
.. autosummary::
:toctree: generated/

4 changes: 2 additions & 2 deletions doc/conf.py
@@ -79,7 +79,7 @@
master_doc = 'index'

# General information about the project.
-project = u'unbalanced_dataset'
+project = u'imbalanced-learn'
copyright = u'2016, Guillaume Lemaitre, Fernando Nogueira'

# The version info for the project you're documenting, acts as replacement for
@@ -89,7 +89,7 @@
# The short X.Y version.
version = '0.1'
# The full version, including alpha/beta/rc tags.
-release = '0.1.dev0'
+release = '0.1'

# The language for content autogenerated by Sphinx. Refer to documentation
# for a list of supported languages.
6 changes: 3 additions & 3 deletions doc/index.rst
@@ -3,9 +3,9 @@
   You can adapt this file completely to your liking, but it should at least
   contain the root `toctree` directive.
-############################################
-Welcome to unbalanced_dataset focumentation!
-############################################
+##########################################
+Welcome to imbalanced-learn documentation!
+##########################################

Contents:
=========
2 changes: 1 addition & 1 deletion doc/install.rst
@@ -5,7 +5,7 @@ Getting Started
Install
=======

-The install of ``UnbalancedDataset`` is almost straightforward. You need to clone it from GitHub_::
+The install of ``imbalanced-learn`` is almost straightforward. You need to clone it from GitHub_::

    $ git clone https://github.com/fmfn/UnbalancedDataset.git
    $ python setup.py install
2 changes: 1 addition & 1 deletion examples/README.txt
@@ -3,4 +3,4 @@
General examples
----------------

-General-purpose and introductory examples for the unbalanced_dataset.
+General-purpose and introductory examples for the `imbalanced-learn` toolbox.
2 changes: 1 addition & 1 deletion examples/combine/plot_smote_enn.py
@@ -20,7 +20,7 @@
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA

-from unbalanced_dataset.combine import SMOTEENN
+from imblearn.combine import SMOTEENN

# Generate the dataset
X, y = make_classification(n_classes=2, class_sep=2, weights=[0.1, 0.9],
2 changes: 1 addition & 1 deletion examples/combine/plot_smote_tomek.py
@@ -20,7 +20,7 @@
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA

-from unbalanced_dataset.combine import SMOTETomek
+from imblearn.combine import SMOTETomek

# Generate the dataset
X, y = make_classification(n_classes=2, class_sep=2, weights=[0.1, 0.9],
2 changes: 1 addition & 1 deletion examples/ensemble/plot_balance_cascade.py
@@ -21,7 +21,7 @@
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA

-from unbalanced_dataset.ensemble import BalanceCascade
+from imblearn.ensemble import BalanceCascade

# Generate the dataset
X, y = make_classification(n_classes=2, class_sep=2, weights=[0.1, 0.9],
2 changes: 1 addition & 1 deletion examples/ensemble/plot_easy_ensemble.py
@@ -21,7 +21,7 @@
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA

-from unbalanced_dataset.ensemble import EasyEnsemble
+from imblearn.ensemble import EasyEnsemble

# Generate the dataset
X, y = make_classification(n_classes=2, class_sep=2, weights=[0.1, 0.9],
2 changes: 1 addition & 1 deletion examples/over-sampling/plot_random_over_sampling.py
@@ -20,7 +20,7 @@
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA

-from unbalanced_dataset.over_sampling import RandomOverSampler
+from imblearn.over_sampling import RandomOverSampler

# Generate the dataset
X, y = make_classification(n_classes=2, class_sep=2, weights=[0.1, 0.9],
2 changes: 1 addition & 1 deletion examples/over-sampling/plot_smote.py
@@ -20,7 +20,7 @@
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA

-from unbalanced_dataset.over_sampling import SMOTE
+from imblearn.over_sampling import SMOTE

# Generate the dataset
X, y = make_classification(n_classes=2, class_sep=2, weights=[0.1, 0.9],
2 changes: 1 addition & 1 deletion examples/over-sampling/plot_smote_bordeline_1.py
@@ -20,7 +20,7 @@
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA

-from unbalanced_dataset.over_sampling import SMOTE
+from imblearn.over_sampling import SMOTE

# Generate the dataset
X, y = make_classification(n_classes=2, class_sep=2, weights=[0.1, 0.9],
2 changes: 1 addition & 1 deletion examples/over-sampling/plot_smote_bordeline_2.py
@@ -20,7 +20,7 @@
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA

-from unbalanced_dataset.over_sampling import SMOTE
+from imblearn.over_sampling import SMOTE

# Generate the dataset
X, y = make_classification(n_classes=2, class_sep=2, weights=[0.1, 0.9],
2 changes: 1 addition & 1 deletion examples/over-sampling/plot_smote_svm.py
@@ -20,7 +20,7 @@
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA

-from unbalanced_dataset.over_sampling import SMOTE
+from imblearn.over_sampling import SMOTE

# Generate the dataset
X, y = make_classification(n_classes=2, class_sep=2, weights=[0.1, 0.9],
6 changes: 3 additions & 3 deletions examples/pipeline/plot_pipeline_classification.py
@@ -17,9 +17,9 @@
from sklearn.metrics import classification_report


-from unbalanced_dataset.pipeline import make_pipeline
-from unbalanced_dataset.under_sampling import EditedNearestNeighbours, \
-    RepeatedEditedNearestNeighbours
+from imblearn.pipeline import make_pipeline
+from imblearn.under_sampling import EditedNearestNeighbours
+from imblearn.under_sampling import RepeatedEditedNearestNeighbours

# Generate the dataset
X, y = make_classification(n_classes=2, class_sep=1.25, weights=[0.3, 0.7],
2 changes: 1 addition & 1 deletion examples/under-sampling/plot_cluster_centroids.py
@@ -20,7 +20,7 @@
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA

-from unbalanced_dataset.under_sampling import ClusterCentroids
+from imblearn.under_sampling import ClusterCentroids

# Generate the dataset
X, y = make_classification(n_classes=2, class_sep=2, weights=[0.1, 0.9],
@@ -20,7 +20,7 @@
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA

-from unbalanced_dataset.under_sampling import CondensedNearestNeighbour
+from imblearn.under_sampling import CondensedNearestNeighbour

# Generate the dataset
X, y = make_classification(n_classes=2, class_sep=2, weights=[0.1, 0.9],
2 changes: 1 addition & 1 deletion examples/under-sampling/plot_edited_nearest_neighbours.py
@@ -20,7 +20,7 @@
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA

-from unbalanced_dataset.under_sampling import EditedNearestNeighbours
+from imblearn.under_sampling import EditedNearestNeighbours

# Generate the dataset
X, y = make_classification(n_classes=2, class_sep=2, weights=[0.1, 0.9],
@@ -20,7 +20,7 @@
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA

-from unbalanced_dataset.under_sampling import InstanceHardnessThreshold
+from imblearn.under_sampling import InstanceHardnessThreshold

# Generate the dataset
X, y = make_classification(n_classes=2, class_sep=1., weights=[0.05, 0.95],
2 changes: 1 addition & 1 deletion examples/under-sampling/plot_nearmiss_1.py
@@ -20,7 +20,7 @@
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA

-from unbalanced_dataset.under_sampling import NearMiss
+from imblearn.under_sampling import NearMiss

# Generate the dataset
X, y = make_classification(n_classes=2, class_sep=2, weights=[0.1, 0.9],
2 changes: 1 addition & 1 deletion examples/under-sampling/plot_nearmiss_2.py
@@ -20,7 +20,7 @@
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA

-from unbalanced_dataset.under_sampling import NearMiss
+from imblearn.under_sampling import NearMiss

# Generate the dataset
X, y = make_classification(n_classes=2, class_sep=2, weights=[0.1, 0.9],
2 changes: 1 addition & 1 deletion examples/under-sampling/plot_nearmiss_3.py
@@ -20,7 +20,7 @@
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA

-from unbalanced_dataset.under_sampling import NearMiss
+from imblearn.under_sampling import NearMiss

# Generate the dataset
X, y = make_classification(n_classes=2, class_sep=2, weights=[0.1, 0.9],
@@ -20,7 +20,7 @@
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA

-from unbalanced_dataset.under_sampling import NeighbourhoodCleaningRule
+from imblearn.under_sampling import NeighbourhoodCleaningRule

# Generate the dataset
X, y = make_classification(n_classes=2, class_sep=2, weights=[0.1, 0.9],
2 changes: 1 addition & 1 deletion examples/under-sampling/plot_one_sided_selection.py
@@ -20,7 +20,7 @@
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA

-from unbalanced_dataset.under_sampling import OneSidedSelection
+from imblearn.under_sampling import OneSidedSelection

# Generate the dataset
X, y = make_classification(n_classes=2, class_sep=2, weights=[0.1, 0.9],
0 comments on commit b6e015e