OFDMModulator() returns a time-domain OFDM signal with size = None (in training)
#7
Comments
Thank you for this bug report. Could you please provide a gist instead of a pdf, so that we can easily copy the code? Could you provide a code example that isolates the problem as much as possible, i.e., the shortest possible code snippet that allows us to reproduce the bug?
Done!
Hi, I think that there is an issue with your gist. The model in cell 6 does not seem to return anything. I have created an alternative gist, similar to yours, which works nicely: https://gist.github.com/jhoydis/0f09064c89accdb55c8135afcb1982e9
Hi,
BTW, I did not add a return to the function since gist v2: https://gist.github.com/kassankar/15842eaa59c4cbb969ad0aded2ff5613
I was able to reproduce the error in this minimal gist: https://gist.github.com/jhoydis/7720a7a2af236e5061d907e969084e85
Describe the bug

The class `OFDMModulator()` returns a time-domain OFDM signal with size = None while training the Keras model (only while training the model).

To Reproduce

Please find a gist to reproduce the error: https://gist.github.com/kassankar/70d62384d05b00fa8b9486ed862a4784. The bug occurs while running block number 7, due to `OFDMModulator()` in block number 6.

Expected behavior

Based on the `OFDMModulator` description, the function should return an output of size = `num_ofdm_symbols * (fft_size + cyclic_prefix_length)`.
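As an illustration of the expected output size, here is a minimal sketch of the size arithmetic. The parameter values and the CP-prepend step are my own assumptions for illustration, not Sionna's actual implementation:

```python
import tensorflow as tf

# Hypothetical parameters, chosen only for the size arithmetic
num_ofdm_symbols, fft_size, cyclic_prefix_length = 14, 76, 6

x = tf.zeros([4, num_ofdm_symbols, fft_size])  # [batch, symbols, subcarriers]
# Prepend the cyclic prefix (last CP samples of each symbol) and flatten
x_cp = tf.concat([x[..., -cyclic_prefix_length:], x], axis=-1)
out = tf.reshape(x_cp, [4, -1])

# Expected last dimension: 14 * (76 + 6) = 1148
print(out.shape)  # (4, 1148)
```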
Screenshots
Additional context
The problem is directly connected to the function `flatten_last_dims()` used in the `OFDMModulator()` class, more precisely in its last two lines:

`new_shape = tf.concat([shape[:-num_dims], [-1]], 0)`
`return tf.reshape(tensor, new_shape)`

I think the problem occurs due to the use of `[-1]` in the `tf.reshape()` call. During training, the batch_size is equal to `None`, and based on the description of `tf.reshape()`, a dimension given as `-1` is computed so that the total size remains constant. However, because batch_size = `None`, `tf.reshape()` returns another `None` dimension for the last output axis.

Note: `tf.reshape()` with `[-1]` is also used in `OFDMDemodulator()`, in
`new_shape = tf.concat([tf.shape(inputs)[:-1], [-1], [self.fft_size + self.cyclic_prefix_length]], 0)`
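The behavior described above can be demonstrated in isolation. The `flatten_last_dims_buggy` function below reproduces the two quoted lines; the `flatten_last_dims_fixed` variant is my own suggested workaround, not the library's code: when the trailing dimensions are statically known, compute the flattened size explicitly instead of passing `-1`, so the static shape survives a `None` batch dimension.

```python
import tensorflow as tf

def flatten_last_dims_buggy(tensor, num_dims=2):
    # Dynamic shape + [-1]: the static size of the flattened axis is lost
    # when the batch dimension is None (as inside a Keras training graph).
    shape = tf.shape(tensor)
    new_shape = tf.concat([shape[:-num_dims], [-1]], 0)
    return tf.reshape(tensor, new_shape)

def flatten_last_dims_fixed(tensor, num_dims=2):
    # Suggested workaround: take the flattened size from the static shape
    # when the trailing num_dims dimensions are fully defined.
    flat = tensor.shape[-num_dims:].num_elements()  # Python int or None
    if flat is not None:
        new_shape = tf.concat([tf.shape(tensor)[:-num_dims], [flat]], 0)
    else:
        new_shape = tf.concat([tf.shape(tensor)[:-num_dims], [-1]], 0)
    return tf.reshape(tensor, new_shape)

# Batch dimension None, as it is while training a Keras model
spec = tf.TensorSpec([None, 14, 82], tf.float32)
buggy = tf.function(flatten_last_dims_buggy).get_concrete_function(spec)
fixed = tf.function(flatten_last_dims_fixed).get_concrete_function(spec)
print(buggy.structured_outputs.shape)  # (None, None)
print(fixed.structured_outputs.shape)  # (None, 1148)
```

The eager-mode behavior is unaffected by the fix; only the static shape seen by downstream Keras layers changes.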