
Suppress warnings does not work #43

Open
CBrauer opened this issue Sep 13, 2021 · 3 comments

Comments

@CBrauer

CBrauer commented Sep 13, 2021

Please tell me how to suppress warnings in my Jupyter notebook.
I tried:

import os
os.environ['TF_CPP_MIN_LOG_LEVEL'] = '3'  # silence TensorFlow's C++ log output
import tensorflow as tf
import warnings
warnings.filterwarnings("ignore")
import pandas as pd
from fitter import Fitter

feature_names = ['BoxRatio', 'Thrust', 'Acceleration', 'Velocity', 'OnBalRun', 'vwapGain', 'Expect', 'Trin']
response_name = ['Altitude']

df_train = pd.read_csv("/HedgeTools/Datasets/rocket-train-classify.csv")

for name in feature_names:
    print('feature name: ', name)
    X_train = df_train[name].values
    f = Fitter(X_train)
    f.fit(progress=True)
    f.summary(plot=True)
    print('best transform: ', f.get_best())

and I'm getting

WARNING:root:SKIPPED alpha distribution (taking more than 30 seconds)
WARNING:root:SKIPPED beta distribution (taking more than 30 seconds)
WARNING:root:SKIPPED arcsine distribution (taking more than 30 seconds)
WARNING:root:SKIPPED anglit distribution (taking more than 30 seconds)
WARNING:root:SKIPPED argus distribution (taking more than 30 seconds)
WARNING:root:SKIPPED betaprime distribution (taking more than 30 seconds)
WARNING:root:SKIPPED bradford distribution (taking more than 30 seconds)
WARNING:root:SKIPPED burr distribution (taking more than 30 seconds)
WARNING:root:SKIPPED burr12 distribution (taking more than 30 seconds)
WARNING:root:SKIPPED chi2 distribution (taking more than 30 seconds)

Charles

@meetsaiya

There's an attribute that controls the allowed run time; it's set to 30 seconds by default. Kindly check the documentation. If you have a big dataset or the fit is computationally intensive, you can increase it (see the sketch below). However, it must be set to some value, because if the curve doesn't really fit, the optimization can diverge and run open-ended, which makes computers angry.
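Something like this is a minimal sketch of what I mean (it assumes the timeout argument of the Fitter() constructor, in seconds, is the attribute in question):

from fitter import Fitter

# give each candidate distribution up to 120 seconds instead of the default 30
f = Fitter(X_train, timeout=120)
f.fit(progress=True)
f.summary(plot=True)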

@NSGregory

This can be addressed with the logging module, since that is what emits these messages. They come from the "root" logger, which can be accessed with:

logging.getLogger()

You can change the threshold for the root logger by:

logging.getLogger().setLevel(logging.CRITICAL)

The parameter for setLevel accepts several levels, which are described in the logging documentation. If you pass a string to getLogger(), it returns the logger with that name. It seems that fitter does not have its own 'fitter' logger but uses the root logger, which is why the argument is left blank in this example.
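Putting that together with the original loop, a minimal sketch (using only the standard library logging module) looks like this:

import logging
from fitter import Fitter

# raise the root logger's threshold so the WARNING-level "SKIPPED ..." messages are dropped
logging.getLogger().setLevel(logging.CRITICAL)

f = Fitter(X_train)
f.fit(progress=True)
f.summary(plot=True)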

@ghayward

For me, it helped to first try raising the timeout = ## argument of the Fitter() constructor. See if that works... but if it never converges and just can't solve, then I'm less sure what to do.
