
Melspectrogram can't be set 'trainable_fb=False' #67

Closed
zhh6 opened this issue Apr 12, 2020 · 8 comments

@zhh6

zhh6 commented Apr 12, 2020

Melspectrogram can't be set with 'trainable_fb=False'. After I set trainable_fb=False and trainable_kernel=False, it seems like it doesn't work; the layer is still trainable.

@Toku11

Toku11 commented May 17, 2020

It's a problem with tf.keras. The first solution that comes to mind is to declare the Melspectrogram and then set the model's layers as not trainable:

model = Sequential()
model.add(Melspectrogram(sr=SR, n_mels=128,
                         n_dft=512, n_hop=128, input_shape=input_shape,
                         return_decibel_melgram=True,
                         trainable_kernel=False, name='melgram', trainable_fb=False))
for layer in model.layers:
    layer.trainable = False

You can verify with:
model.non_trainable_weights
Output:

[<tf.Variable 'melgram/real_kernels:0' shape=(512, 1, 1, 257) dtype=float32>,
<tf.Variable 'melgram/imag_kernels:0' shape=(512, 1, 1, 257) dtype=float32>,
<tf.Variable 'melgram/Variable:0' shape=(257, 128) dtype=float32>]

The first two are part of the Spectrogram class [kernel], and Variable:0 is [fb]; it shows up without a proper name because the author forgot to set one. You should be able to set any of them as non-trainable separately.
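
A minimal check of this, assuming the model built above: list each variable with its shape and whether Keras will actually update it during training.

for layer in model.layers:
    for w in layer.trainable_weights:
        print('trainable:    ', w.name, w.shape)
    for w in layer.non_trainable_weights:
        print('non-trainable:', w.name, w.shape)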

@keunwoochoi
Owner

@zhh6 Could you provide test code that checks it, like we have in tests? I can't spend a lot of time updating Kapre these days, but I can try something small. I guess it'd be fixed if we follow the new guide, i.e., something like

self.b = self.add_weight(shape=(units,),
                         initializer='zeros',
                         trainable=True)
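
For reference, a minimal sketch of that direction, assuming a hypothetical layer (the names MelFilterbankLayer and freq2mel are illustrative, not Kapre's actual API); registering the filterbank via add_weight lets trainable_fb control whether Keras updates it:

import tensorflow as tf

class MelFilterbankLayer(tf.keras.layers.Layer):
    # Hypothetical sketch: the mel filterbank is registered as a named weight,
    # and its trainable flag is taken from the constructor argument.
    def __init__(self, n_freq=257, n_mels=128, trainable_fb=False, **kwargs):
        super().__init__(**kwargs)
        self.n_freq = n_freq
        self.n_mels = n_mels
        self.trainable_fb = trainable_fb

    def build(self, input_shape):
        # In Kapre the initializer would be the mel filterbank matrix itself;
        # 'glorot_uniform' here is only a placeholder.
        self.freq2mel = self.add_weight(name='freq2mel',
                                        shape=(self.n_freq, self.n_mels),
                                        initializer='glorot_uniform',
                                        trainable=self.trainable_fb)
        super().build(input_shape)

    def call(self, x):
        # x: (batch, time, n_freq) spectrogram -> (batch, time, n_mels)
        return tf.matmul(x, self.freq2mel)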

@zhh6
Author

zhh6 commented May 18, 2020

Thanks a lot. I have solved the problem, just like you suggested:

m = Melspectrogram(sr=SR, n_mels=128, n_dft=512, n_hop=128, input_shape=input_shape,
                   return_decibel_melgram=True, trainable_kernel=False,
                   name='melgram', trainable_fb=False)
m.trainable = False
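
A quick sanity check, assuming the layer above (the Sequential wrapping is only there to build it): a frozen layer should report no trainable weights at all.

from tensorflow.keras.models import Sequential

model = Sequential([m])                      # input_shape was passed to the layer, so it builds here
print(model.trainable_weights)               # expected: []
print(len(model.non_trainable_weights))      # STFT kernels + filterbank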

@Toku11

Toku11 commented May 18, 2020

Why not use only tf.Variable(..., trainable=False)? My question, though, is: @keunwoochoi, why did you decide to expose those variables as "weights", trainable or not? Shouldn't they be fixed?
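
Roughly what the first part refers to (the values are placeholders, not Kapre's actual filterbank): a plain tf.Variable created with trainable=False is never collected into a layer's trainable weights.

import numpy as np
import tensorflow as tf

fb = tf.Variable(np.zeros((257, 128), dtype=np.float32), trainable=False, name='freq2mel')
print(fb.trainable)  # False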

@keunwoochoi
Owner

That's because one might want to set it trainable to optimize the time-frequency representation.

@keunwoochoi
Owner

This would be one of the things I'd like to fix with respect to TF 2.0, along with using tf.stft. I can also quickly update the code to use self.add_weight(), but that should come with dropping support for TF 1.x (which I'm fine with).

@Toku11

Toku11 commented May 18, 2020

I'm trying to update it using tf.stft... however, I think it's going to be impossible to set the kernels as trainable in that scenario. In fact, the Spectrogram class would become very short. Do you have any idea or recommendation for that update?
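
A minimal sketch of what that might look like, assuming a tf.signal-based layer (the class name and parameters are illustrative, not Kapre's final API); since tf.signal.stft is a fixed op, the layer ends up with no weights at all, which is why the kernels can't be trainable:

import tensorflow as tf

class STFTSpectrogram(tf.keras.layers.Layer):
    def __init__(self, n_fft=512, hop_length=128, **kwargs):
        super().__init__(**kwargs)
        self.n_fft = n_fft
        self.hop_length = hop_length

    def call(self, waveforms):
        # waveforms: (batch, samples). tf.signal.stft is a fixed op, so this
        # layer has no trainable kernels and no weights at all.
        stft = tf.signal.stft(waveforms,
                              frame_length=self.n_fft,
                              frame_step=self.hop_length,
                              fft_length=self.n_fft)
        return tf.abs(stft)  # magnitude spectrogram, (batch, frames, n_fft // 2 + 1)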

@keunwoochoi
Owner

That'd be great, thanks! And you're right - with tf.stft, one can't make it trainable.

I just made https://github.com/keunwoochoi/kapre/tree/kapre-0.2. This branch is to aggregate changes towards Kapre 0.2, which will drop TF 1.x completely. So, for that kind of change, please make a PR against this branch and I'll later merge all of them into master at once.
Besides that, the PR should be just typical - with some unit tests, which we can now compare against librosa.stft :)
