
ExponentiatedGradientReduction does not work for drop_prot_attr=True, needs one-line fix #267

Closed
jdnklau opened this issue Aug 31, 2021 · 0 comments · Fixed by #268

jdnklau commented Aug 31, 2021

In the implementation of ExponentiatedGradientReduction, dropping the protected attribute fails during prediction.

This is due to this line, which does not drop the protected attribute from the columns correctly.

It should rather be (cf. this line above from a similar method):

X = X.drop(self.prot_attr, axis=1)
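The `axis=1` matters because pandas' `DataFrame.drop` defaults to `axis=0`, i.e. it looks for the label in the row index rather than the columns. A minimal standalone sketch of the failure (the column name is illustrative, not taken from any particular dataset):

```python
import pandas as pd

# Toy feature frame with a protected attribute column (names are illustrative).
X = pd.DataFrame({"protected_attr": [0, 1], "feature": [3.4, 5.6]})

# Without axis=1, drop searches the row index for 'protected_attr'
# and raises a KeyError, matching the traceback below.
try:
    X.drop("protected_attr")
except KeyError as e:
    print(e)

# The fix: drop along the columns axis instead.
X_fixed = X.drop("protected_attr", axis=1)
print(list(X_fixed.columns))  # ['feature']
```

Equivalently, `X.drop(columns=self.prot_attr)` would also work on any pandas version that supports the `columns` keyword.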

In the current implementation, initialising ExponentiatedGradientReduction with drop_prot_attr=True allows fitting the in-processing model but raises an exception at prediction time:

---------------------------------------------------------------------------
KeyError                                  Traceback (most recent call last)
<ipython-input-194-e05fe27a609c> in <module>
     12         eta_mul=2.0,
     13         drop_prot_attr=True) # Bug in AIF360 causes True to crash algorithm...
---> 14     train_inprocessed(clf, train_discriminated_protected, test_discriminated, test_ground_truth,
     15                       label=f"## Exponentiated Gradient Reduction, eps={eps}")
     16     print()

<ipython-input-182-0b8ee41d60dc> in train_inprocessed(inprocessing, train, test, construct_test, label)
      6 
      7 
----> 8     pred = inprocessing.predict(test)
      9 
     10     eval_fairness(pred)

~/.miniconda3/lib/python3.8/site-packages/aif360/algorithms/transformer.py in wrapper(self, *args, **kwargs)
     25     @wraps(func)
     26     def wrapper(self, *args, **kwargs):
---> 27         new_dataset = func(self, *args, **kwargs)
     28         if isinstance(new_dataset, Dataset):
     29             new_dataset.metadata = new_dataset.metadata.copy()

~/.miniconda3/lib/python3.8/site-packages/aif360/algorithms/inprocessing/exponentiated_gradient_reduction.py in predict(self, dataset)
    111         try:
    112             # Probability of favorable label
--> 113             scores = self.model.predict_proba(X_df)[:, fav]
    114             dataset_new.scores = scores.reshape(-1, 1)
    115         except (AttributeError, NotImplementedError):

~/.miniconda3/lib/python3.8/site-packages/aif360/sklearn/inprocessing/exponentiated_gradient_reduction.py in predict_proba(self, X)
    145         """
    146         if self.drop_prot_attr:
--> 147             X = X.drop(self.prot_attr)
    148 
    149         return self.model._pmf_predict(X)

~/.local/lib/python3.8/site-packages/pandas/core/frame.py in drop(self, labels, axis, index, columns, level, inplace, errors)
   4303                 weight  1.0     0.8
   4304         """
-> 4305         return super().drop(
   4306             labels=labels,
   4307             axis=axis,

~/.local/lib/python3.8/site-packages/pandas/core/generic.py in drop(self, labels, axis, index, columns, level, inplace, errors)
   4150         for axis, labels in axes.items():
   4151             if labels is not None:
-> 4152                 obj = obj._drop_axis(labels, axis, level=level, errors=errors)
   4153 
   4154         if inplace:

~/.local/lib/python3.8/site-packages/pandas/core/generic.py in _drop_axis(self, labels, axis, level, errors)
   4185                 new_axis = axis.drop(labels, level=level, errors=errors)
   4186             else:
-> 4187                 new_axis = axis.drop(labels, errors=errors)
   4188             result = self.reindex(**{axis_name: new_axis})
   4189 

~/.local/lib/python3.8/site-packages/pandas/core/indexes/base.py in drop(self, labels, errors)
   5589         if mask.any():
   5590             if errors != "ignore":
-> 5591                 raise KeyError(f"{labels[mask]} not found in axis")
   5592             indexer = indexer[~mask]
   5593         return self.delete(indexer)

KeyError: "['protected_attr'] not found in axis"

Kind regards

nrkarthikeyan added a commit that referenced this issue Sep 7, 2021
Fix exponential gradient reduction without protected attribute (#267)