Deep SHAP giving weird outliers #3372
Unanswered · Janerlend99 asked this question in Q&A · 0 replies
Hi guys,
I've been using Deep SHAP in the SUMO-RL traffic environment, and I've encountered some unusual results.
This is the model wrapper I apply to the PPO policy before passing it to shap.DeepExplainer:
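(The wrapper code itself did not survive the page scrape. As a hedged reconstruction of what the post describes — a PPO policy with a softmax appended so the explainer sees action probabilities — something like the sketch below seems likely; the names `PolicyWrapper` and `policy_net` are illustrative, not the original poster's.)

```python
import torch
import torch.nn as nn

class PolicyWrapper(nn.Module):
    """Hypothetical wrapper: expose the PPO policy's action logits
    followed by a softmax so shap.DeepExplainer explains probabilities."""

    def __init__(self, policy_net: nn.Module):
        super().__init__()
        self.policy_net = policy_net
        self.softmax = nn.Softmax(dim=-1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        logits = self.policy_net(x)   # raw action logits from the policy
        return self.softmax(logits)   # probabilities over actions

# Sketch of intended use (policy_net and background data are assumed):
# explainer = shap.DeepExplainer(PolicyWrapper(policy_net), background)
# shap_values = explainer.shap_values(observations)
```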
These are the results I get:
![MicrosoftTeams-image](https://private-user-images.githubusercontent.com/59925053/279308859-a874fd55-250b-4486-bf16-ced580d907a9.png?jwt=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpc3MiOiJnaXRodWIuY29tIiwiYXVkIjoicmF3LmdpdGh1YnVzZXJjb250ZW50LmNvbSIsImtleSI6ImtleTUiLCJleHAiOjE3MTk3MzgzODYsIm5iZiI6MTcxOTczODA4NiwicGF0aCI6Ii81OTkyNTA1My8yNzkzMDg4NTktYTg3NGZkNTUtMjUwYi00NDg2LWJmMTYtY2VkNTgwZDkwN2E5LnBuZz9YLUFtei1BbGdvcml0aG09QVdTNC1ITUFDLVNIQTI1NiZYLUFtei1DcmVkZW50aWFsPUFLSUFWQ09EWUxTQTUzUFFLNFpBJTJGMjAyNDA2MzAlMkZ1cy1lYXN0LTElMkZzMyUyRmF3czRfcmVxdWVzdCZYLUFtei1EYXRlPTIwMjQwNjMwVDA5MDEyNlomWC1BbXotRXhwaXJlcz0zMDAmWC1BbXotU2lnbmF0dXJlPTZhZmZkMDdmMjVhZDk0ODkzYTEyYzM4ZTY5Y2I4YmY3NWIxNjU1NWFkOGNmMTQ4YTRkMGZkMDUxZmIxYTllMGImWC1BbXotU2lnbmVkSGVhZGVycz1ob3N0JmFjdG9yX2lkPTAma2V5X2lkPTAmcmVwb19pZD0wIn0.3OVnWl4gBm4rN8JA9S0kHTe6D5mGQ5kMXfDdmK1MRbI)
Some instances look reasonable, while some instances look like this:
![MicrosoftTeams-image (2)](https://private-user-images.githubusercontent.com/59925053/279308847-58b1f685-df25-43ed-adf7-f43727088a5c.png?jwt=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpc3MiOiJnaXRodWIuY29tIiwiYXVkIjoicmF3LmdpdGh1YnVzZXJjb250ZW50LmNvbSIsImtleSI6ImtleTUiLCJleHAiOjE3MTk3MzgzODYsIm5iZiI6MTcxOTczODA4NiwicGF0aCI6Ii81OTkyNTA1My8yNzkzMDg4NDctNThiMWY2ODUtZGYyNS00M2VkLWFkZjctZjQzNzI3MDg4YTVjLnBuZz9YLUFtei1BbGdvcml0aG09QVdTNC1ITUFDLVNIQTI1NiZYLUFtei1DcmVkZW50aWFsPUFLSUFWQ09EWUxTQTUzUFFLNFpBJTJGMjAyNDA2MzAlMkZ1cy1lYXN0LTElMkZzMyUyRmF3czRfcmVxdWVzdCZYLUFtei1EYXRlPTIwMjQwNjMwVDA5MDEyNlomWC1BbXotRXhwaXJlcz0zMDAmWC1BbXotU2lnbmF0dXJlPWI2NzQ0ODlhNTMyMTUyN2U0NzVkYzJhZWI0YTRiNzNlMDY3MDE5MDk5MDYyYjM5MTBkNDg5ZDFkZGUyOThlYzEmWC1BbXotU2lnbmVkSGVhZGVycz1ob3N0JmFjdG9yX2lkPTAma2V5X2lkPTAmcmVwb19pZD0wIn0.ElhZelWyR9lWRiAsb8GmbGSyep5SKsl0OpmH4izx81o)
These are the results I get when I leave out the softmax layer, or when I use Kernel SHAP instead:
![shap_beeswarm](https://private-user-images.githubusercontent.com/59925053/279309414-1f2b1e3e-b48d-4667-8bed-5fc0a0befb3a.png?jwt=eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpc3MiOiJnaXRodWIuY29tIiwiYXVkIjoicmF3LmdpdGh1YnVzZXJjb250ZW50LmNvbSIsImtleSI6ImtleTUiLCJleHAiOjE3MTk3MzgzODYsIm5iZiI6MTcxOTczODA4NiwicGF0aCI6Ii81OTkyNTA1My8yNzkzMDk0MTQtMWYyYjFlM2UtYjQ4ZC00NjY3LThiZWQtNWZjMGEwYmVmYjNhLnBuZz9YLUFtei1BbGdvcml0aG09QVdTNC1ITUFDLVNIQTI1NiZYLUFtei1DcmVkZW50aWFsPUFLSUFWQ09EWUxTQTUzUFFLNFpBJTJGMjAyNDA2MzAlMkZ1cy1lYXN0LTElMkZzMyUyRmF3czRfcmVxdWVzdCZYLUFtei1EYXRlPTIwMjQwNjMwVDA5MDEyNlomWC1BbXotRXhwaXJlcz0zMDAmWC1BbXotU2lnbmF0dXJlPTAzOTIzMWU2NjM5NDkyZDRiYTNmMGI5MTY1MTk5Njk3ZGZlY2E1YWJhNTcxMGZmMzIxZGUwMTI3OThlNjZmMDYmWC1BbXotU2lnbmVkSGVhZGVycz1ob3N0JmFjdG9yX2lkPTAma2V5X2lkPTAmcmVwb19pZD0wIn0.QiTwV6NjwrsD2wfpBSseP3pXP_FuxMR4xrsyKtvT6Ng)
Does anyone know why I get such large SHAP values for some features in a single instance when using softmax?
Also, at the start I used this wrapper, but I found out that DeepExplainer did not support the operations on x in the forward function. The code ran successfully, but the results were definitely not correct. Can anyone explain why I can't define the forward function that way?
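(That earlier wrapper is also missing from the scraped post, so the sketch below is a guess at the kind of forward function that trips DeepExplainer up, not the original code. A commonly cited explanation is that the PyTorch DeepExplainer attaches hooks to `nn.Module` layers in order to apply its DeepLIFT-style attribution rules; raw tensor operations written directly in `forward()` execute normally under autograd but are never seen by those hooks, which would match the symptom of code that runs yet produces wrong attributions.)

```python
import torch
import torch.nn as nn

class ReorderingWrapper(nn.Module):
    """Hypothetical wrapper that manipulates the observation tensor
    inside forward(). It runs without error, but because DeepExplainer
    hooks nn.Module layers, the slicing and concatenation below are
    invisible to it, so attributions can be silently wrong."""

    def __init__(self, net: nn.Module):
        super().__init__()
        self.net = net

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        phase = x[:, :4]                       # plain tensor slicing
        queues = x[:, 4:]
        x = torch.cat([queues, phase], dim=1)  # plain tensor concat
        return self.net(x)

# A common workaround: do any rearranging of the input *before* it
# reaches the explainer, and keep forward() a straight chain of
# nn.Module layers.
```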