
Application keeps dying even after clearing Keras session #10911

Closed
spndo opened this issue Aug 15, 2018 · 1 comment

spndo commented Aug 15, 2018

Hi,

I am training a machine learning model that contains two fully connected layers using Keras with the TensorFlow backend. When performing an exhaustive search to tune some of the hyperparameters, I create a new model for every set of hyperparameters. Nevertheless, the training time of the models increases at each iteration. To fix this, I clear the Keras session at every iteration using K.clear_session().

If I run my code on the CPU, it runs just fine. However, when attempting to use the GPU with the TensorFlow backend, the application suddenly crashes without any warnings or errors.

The following is the function that I use to create the new model.

from keras import backend as K
from keras.optimizers import Adam

def get_compiled_model(model_def, shape, model_type='ann'):

    K.clear_session()  # Clear the previous TensorFlow graph
    #tf.reset_default_graph()

    # Shared parameters for the models
    optimizer = Adam(lr=0, beta_1=0.5)
    lossFunction = "mean_squared_error"
    metrics = ["mse"]
    model = None

    #Create and compile the models

    if model_type=='ann':
        model = model_def(shape)
        model.compile(optimizer=optimizer, loss=lossFunction, metrics=metrics)
    else:
        pass

    return model

models = {'shallow-20':RULmodel_SN_5}
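For illustration only, the exhaustive search described above can also be sketched as a single flat loop over the hyperparameter grid using itertools.product; here run_trial and the parameter ranges are hypothetical stand-ins for the model build/train step, not part of the original script:

```python
from itertools import product

def run_trial(w, s, r):
    # Hypothetical stand-in for: rebuild the model, load the data,
    # train, and score one (window, stride, max_rul) combination.
    return (w, s, r)

def grid_search(windows, strides, ruls):
    # Enumerate every (w, s, r) combination exactly once, mirroring
    # the three nested loops in the driver code.
    results = []
    for w, s, r in product(windows, strides, ruls):
        results.append(run_trial(w, s, r))
    return results
```

A flat enumeration like this makes it easy to count trials up front and to hand each combination to a separate worker if needed.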

This is the portion of the code that calls the function above

for dataset_number in max_window_size:
    
    tunable_model.data_handler.change_dataset(dataset_number)
    
    verbose = 1
    
    for r in range(90, 141):   #Load max_rul first as it forces reloading the dataset from file
        
        verbose = 2
        tunable_model.data_handler.max_rul = r
        
        for w in range(15, max_window_size[dataset_number]+1):
        
            for s in range(1,11):
                
                print("Testing for w:{}, s:{}, r:{}".format(w, s, r))
                
                #Set data parameters
                tunable_model.data_handler.sequence_length = w
                tunable_model.data_handler.sequence_stride = s

                #Create and compile the models
                shape = num_features*w
                model = get_compiled_model(models['shallow-20'], shape, model_type='ann')

                #Add model to tunable model
                tunable_model.change_model('ModelRUL_SN', model, 'keras')
                                
                #Load the data
                tunable_model.load_data(unroll=True, verbose=verbose, cross_validation_ratio=0)
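One workaround worth trying when K.clear_session() does not prevent the crashes, sketched here under the assumption that GPU state is only fully released when the owning process exits: run each trial in a short-lived child process so the TensorFlow GPU context is torn down between trials. The trial function below is a hypothetical stand-in for one build/train/evaluate cycle, not code from this issue:

```python
from multiprocessing import Process, Queue

def trial(params, queue):
    # Hypothetical stand-in for: build the model, train, evaluate.
    # In the real script, get_compiled_model() and the training call
    # would run here; the GPU context dies with this process.
    queue.put({'params': params, 'score': sum(params)})

def run_isolated(params):
    # Run one trial in a child process so all GPU memory is released
    # when the process exits, instead of relying on K.clear_session().
    queue = Queue()
    p = Process(target=trial, args=(params, queue))
    p.start()
    result = queue.get()
    p.join()
    return result
```

The cost is the per-trial process startup and TensorFlow re-initialization, but each trial starts from a clean GPU context.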

@mikkokotila

This seems to be a Windows thing. I believe there is no problem on Linux-based systems.
