
Switch from SmartPredictor to SmartExplainer #111

Closed
chetanambi opened this issue Feb 3, 2021 · 4 comments · Fixed by #146
Labels: enhancement (New feature or request), shapash 1.2.0

Comments

@chetanambi

Hi Team,

Is it possible to generate local explanations on the new data added to a SmartPredictor object? I went through all the tutorials and I understand that, when we are satisfied with the explainability results given by Shapash, we can use the SmartPredictor object for deployment. But my question is: how can we get the local explanation chart on the new data that comes in after deployment? I hope my question is clear.
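
For context, here is a rough sketch of the flow I mean, assuming shapash's 1.x API; the dataset, model, and variable names (X_new, etc.) are just placeholders for illustration:

```python
import pandas as pd
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor
from shapash.explainer.smart_explainer import SmartExplainer

# Placeholder data and model, only to make the sketch runnable
data = load_diabetes()
X = pd.DataFrame(data.data, columns=data.feature_names)
y = pd.Series(data.target)
model = RandomForestRegressor(n_estimators=50).fit(X, y)

# Exploration phase: validate the explanations with SmartExplainer
xpl = SmartExplainer()
xpl.compile(x=X, model=model)

# Deployment phase: SmartPredictor handles new, incoming data
predictor = xpl.to_smartpredictor()
X_new = X.sample(5)                               # stand-in for new production data
predictor.add_input(x=X_new)
contributions = predictor.detail_contributions()  # local contributions per row
# There is no plotting method on SmartPredictor, hence my question about charts.
```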

Thanks,
Chetan

@johannmartin95
Contributor

Hello @chetanambi,

Thanks again for your feedback. Glad to see that you've taken the solution in hand.

To answer your question:
The local explanation chart is only available through the SmartExplainer object for now. However, we have just discussed your question with the team, and the need behind it sounds like a good idea. We are currently thinking about the best way to integrate functionality that addresses it, and we will try to keep you informed about this.
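
In the meantime, a rough sketch of the workaround, reusing the placeholder names from the sketch in your message: re-compile a SmartExplainer on the new data and call its local plot (the index value is just an example):

```python
# Workaround sketch: build a SmartExplainer directly on the new data
xpl_new = SmartExplainer()
xpl_new.compile(x=X_new, model=model)
xpl_new.plot.local_plot(index=X_new.index[0])  # local explanation chart for one row
```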

If you are interested, you can also contribute by forking this repository and following the steps described in the contributing.md file available in the repository.

Thanks again for your contributions.

We hope this answer suits you for now.

The Team Shapash.

@johannmartin95
Contributor

Description of Problem:
In deployment, we use the SmartPredictor object to get predictions and explainability from our models on specific datasets. The issue here is the possibility of generating charts to visualize our results on this new data after deployment.

Overview of the Solution:

  • Add a new method on the SmartPredictor object, "to_smartexplainer", to switch from a SmartPredictor object to a SmartExplainer one (a usage sketch follows this list)
    • Start with a check to ensure that the add_input step has been done on the SmartPredictor, so we have the specific data on which we want to analyse the charts
    • Initialize a SmartExplainer object with the attributes of the SmartPredictor
    • Run the compile step of the SmartExplainer, using the attributes and input of the SmartPredictor as parameters
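
A hypothetical usage sketch of this method, continuing with the placeholder names used earlier in this thread (the exact signature may still change):

```python
# Hypothetical usage of the proposed to_smartexplainer method
predictor.add_input(x=X_new)                    # required step: the data to analyse
xpl_back = predictor.to_smartexplainer()        # switch back to a SmartExplainer
xpl_back.plot.local_plot(index=X_new.index[0])  # local chart on the deployed data
```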

Examples:
[Screenshots illustrating the proposed to_smartexplainer workflow were attached to the original issue; they are not reproduced here.]

@johannmartin95 johannmartin95 changed the title Local explanations on SmartPredictor Switch from SmartPredictor to SmartExplainer Feb 24, 2021
@johannmartin95 johannmartin95 self-assigned this Feb 24, 2021
@johannmartin95 johannmartin95 added the enhancement New feature or request label Feb 24, 2021
@chetanambi
Author

@JohannMartin When will this feature be available for use?

@johannmartin95
Contributor

Hello @chetanambi,

We are currently working on a new release of Shapash, and this method will be included. We have planned it for next week.
We'll keep you informed if there are any changes.

Thanks again for your contributions.

The Team Shapash.
