Nonlinear regression using Keras #1874
Comments
It's hard to give generic advice without knowing the specifics of your data. For example, is your data labelled -1/+1? Does it work better if you try without a hidden layer first? The Google group is probably a better place to ask for advice like this.
I think this question is better suited for the Keras Google group: it isn't specifically a package-related question.
Unfortunately the answer is no. There is no magical tool that does what you want. You can try:
@pasky @hlin117 Thanks for your advice. I will move my issue to the Google group later. @mrwns Thank you for your concern. I have formatted my input using the sklearn API, and I was wondering whether Keras provides any methods to format data? @philipperemy Really appreciate your answer; the regression result is now much better. However, there is still a problem: some negative numbers come out, which is not expected. Is that related to the data format? I scale the data into [-1, 1] with a mean of 0. How should I constrain the regression result to be all positive in this circumstance? Really, thanks for all your help! The NN I created is as follows:

```python
X_train_scale = preprocessing.scale(X_train)
model = Sequential()
```
@polarlight1994 Yes, it is somewhat related to your inputs, but you can also modify your model to handle it. I see two ways to fix your problem. First, you can normalize your data in a different way: http://stats.stackexchange.com/a/70808 Second, if you want to stick with your current normalization, you can switch your final Activation layer from linear to ReLU: https://en.wikipedia.org/wiki/Rectifier_(neural_networks) So you can replace your last layer by:
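(The snippet that followed is not preserved in this thread. A sketch of the suggestion, written against the current Keras API rather than the 2016 `input_dim`/`output_dim` style the thread used: the only change is the final activation, which clamps every prediction to be non-negative.)

```python
# Illustrative sketch only -- not the original snippet from the thread.
# Swapping the final linear activation for ReLU means the model
# cannot predict negative values, whatever weights training finds.
import numpy as np
import keras
from keras import layers

model = keras.Sequential([
    keras.Input(shape=(4,)),
    layers.Dense(500, activation='tanh'),
    layers.Dense(1, activation='relu'),  # final ReLU clamps outputs to >= 0
])
model.compile(loss='mean_squared_error', optimizer='rmsprop')

# Even on random inputs in [-1, 1], predictions are guaranteed non-negative.
preds = model.predict(np.random.uniform(-1, 1, size=(8, 4)), verbose=0)
```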
Finally, if you want only {0, 1} as output and no intermediate values like 0.123, you may have a look at the softmax layer (+ argmax). This becomes a classification problem.
@philipperemy I have tried the second way you mentioned, but the output is all 0. Since the expected output in my problem is not constrained to (0, 1), I was wondering whether the ReLU fails because of this? Also, I was told that if I want to do nonlinear regression I should use a linear output layer. Is that right? By the way, if I use ReLU in the second-to-last layer instead, will that solve my problem?
No, using a linear activation as the final output is not a prerequisite for nonlinear regression. It depends on the range of your output values. The ReLU outputs values in [0, +infinity), the sigmoid in (0, 1), and the linear activation in (-infinity, +infinity). The linear activation obviously allows negative values. What is the interval of your expected data?
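(An aside: those three ranges can be checked with plain numpy stand-ins for the activations, purely for illustration.)

```python
import numpy as np

x = np.array([-5.0, -1.0, 0.0, 1.0, 5.0])

relu = np.maximum(0.0, x)           # range [0, +inf): negatives clamp to 0
sigmoid = 1.0 / (1.0 + np.exp(-x))  # range (0, 1): squashes everything
linear = x                          # range (-inf, +inf): passes values through

print(relu)  # [0. 0. 0. 1. 5.]
# sigmoid stays strictly inside (0, 1); linear keeps negative inputs negative.
```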
@philipperemy My expected data lies in the range [0, +infinity). So, as you explained, I should use ReLU for the output layer. But then I get all-zero output and the loss does not decrease in any epoch. Is it because the input is constrained to (-1, 1), so after the first three sigmoid layers the output of the ReLU is almost always 0?
If you always get 0 as output, it means that all the features at the previous layer are negative. I don't think the problem comes from your model or from your input data. You can always try testing with positive data, but I don't think it will solve your problem.
@philipperemy I followed your advice. Now the output is not always zero and the loss decreases every epoch; however, there are still lots of negative values in the output. I can't figure out why... Here is my latest NN. I used the MinMaxScaler to map my input data into the range (0, 1).

```python
min_max_scaler = preprocessing.MinMaxScaler()
model = Sequential()
model.compile(loss='mean_squared_error', optimizer='rmsprop')
```
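(For reference, sklearn's `MinMaxScaler` with its default feature range maps each feature as x' = (x - min) / (max - min), so every column lands in [0, 1]. A plain numpy equivalent, for illustration only:)

```python
import numpy as np

# Two features with different ranges; scaling is done per column.
X = np.array([[-2.0, 10.0],
              [ 0.0, 20.0],
              [ 2.0, 40.0]])

# x' = (x - min) / (max - min), computed column-wise
X_scaled = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))

print(X_scaled.min(axis=0))  # [0. 0.]
print(X_scaled.max(axis=0))  # [1. 1.]
```

Note that scaling the *inputs* into [0, 1] says nothing about the sign of the *outputs*; only the final activation constrains those.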
It seems very weird that you still have negative values in your output. I tried a very simple example with negative and positive values in your XX_train and XX_test (before the MinMaxScaler maps them into [0, 1]). My expected values were set to -1; I wanted to see whether, despite the ReLU layers, the NN could output negative values. If you execute this code, you will see that all the predicted values are 0: the ReLU layer prevents negative values.
Set the expected values to 1 instead, and you will see all predictions very close to 1 (0.98, 0.99, 1.01, ...). Once again the network can figure out this simple function. Values slightly above 1 are consistent with the definition of the final ReLU layer.
Source code is here:
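(The linked source code is not reproduced in this thread, but the invariant it demonstrates needs no training at all to verify: whatever weights the optimizer reaches, a final ReLU layer cannot emit a negative value. A minimal numpy check, with random weights standing in for trained ones:)

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(100, 4))  # inputs after MinMaxScaler, in [0, 1]

for _ in range(20):                        # any weights training might produce
    w = rng.normal(size=(4, 1))
    b = rng.normal(size=(1,))
    out = np.maximum(0.0, X @ w + b)       # a final ReLU layer
    assert (out >= 0.0).all()              # never negative, regardless of w, b
```

This is why training toward a target of -1 through a final ReLU can only stall at 0: the layer's range simply does not contain the target.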
I was trying to do nonlinear regression using Keras, but the results are far from satisfying. I was wondering how I should choose the layers to build the NN and how to tune parameters like activations, objectives, and others. Are there any principles or guide materials that address this problem? I am a newcomer to deep learning and really need help here. The NN I built is as follows:

```python
model = Sequential()
model.add(Dense(input_dim = 4, output_dim = 500))
model.add(Activation('tanh'))
model.add(Dropout(0.5))
model.add(Dense(input_dim = 500, output_dim = 1))
model.add(Activation('tanh'))
model.compile(loss='mean_absolute_error', optimizer='rmsprop')
```

Thanks~