DNN_detection_via_keras

A Keras implementation of "Power of Deep Learning for Channel Estimation and Signal Detection in OFDM Systems". I have tried my best to simplify the code so that everyone can follow it easily. The original TensorFlow version is available at https://github.com/haoyye/OFDM_DNN.

Requirements

tensorflow-gpu >= 1.12.0

Data sets

I have uploaded the required data sets to BaiduYun Drive.

password: 1234

As some readers requested, I have also provided a download link on Google Drive.

Both copies were generated by saving the numpy arrays loaded from the originally provided .txt files.

Then move channel_train.npy and channel_test.npy into the root of this repository, so that the paths are './channel_train.npy' and './channel_test.npy'.
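For example, a quick sanity check that the files are in the expected locations (a minimal sketch; the printed shapes depend on how many samples were saved):

```python
import numpy as np

# Load the channel samples from the repository root.
channel_train = np.load('./channel_train.npy')
channel_test = np.load('./channel_test.npy')

# The shapes depend on how many channel samples were saved.
print(channel_train.shape, channel_train.dtype)
print(channel_test.shape, channel_test.dtype)
```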

The original datasets are provided at https://github.com/haoyye/OFDM_DNN as .txt files, which take a long time to load. I therefore saved enough samples as .npy files, so that the training set loads quickly and the file size is reduced.
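The conversion itself is just a save of the parsed arrays; a sketch of the idea, assuming a hypothetical channel.txt whose layout np.loadtxt can parse:

```python
import numpy as np

# Hypothetical input file; the real .txt name and layout come from
# the original repo, so adjust the loadtxt call accordingly.
channel = np.loadtxt('./channel.txt')    # slow: plain-text parsing

np.save('./channel_train.npy', channel)  # compact binary format

# Reloading the .npy file is near-instant compared with re-parsing text.
channel = np.load('./channel_train.npy')
```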

How to use

After downloading and moving the data sets, just run main.py directly.

Some evaluation

Since this repo is a reproduction, I follow the original idea of the authors: generate random bits, simulate the channel by loading data from the .npy files, and then build a neural network that recovers the transmitted bits from the received signal.
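A heavily simplified sketch of that pipeline (the real main.py takes its OFDM parameters from Global_parameters.py; the stand-in channel, sample counts, and layer sizes below are illustrative assumptions, loosely following the 500/250/120/16 network of the paper):

```python
import numpy as np
from tensorflow import keras

n_samples, n_bits, n_rx = 10000, 16, 256  # illustrative sizes only

# 1) Generate random bits: these are the training labels.
bits = np.random.randint(0, 2, size=(n_samples, n_bits)).astype(np.float32)

# 2) Simulate the channel. The real code passes OFDM symbols through
# channels loaded from channel_train.npy; a random linear map plus
# noise stands in for that here.
H = np.random.randn(n_bits, n_rx).astype(np.float32)
received = bits @ H + 0.1 * np.random.randn(n_samples, n_rx).astype(np.float32)

# 3) Build a network that recovers the bits from the received signal.
model = keras.Sequential([
    keras.layers.Dense(500, activation='relu', input_shape=(n_rx,)),
    keras.layers.Dense(250, activation='relu'),
    keras.layers.Dense(120, activation='relu'),
    keras.layers.Dense(n_bits, activation='sigmoid'),
])
model.compile(optimizer='adam', loss='mse')
model.fit(received, bits, epochs=5, batch_size=256)
```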

I know some readers want to directly apply the detection network in place of their traditional receiver, for comparison and so on. That is easy to do with this code. In brief, the data-generation code is not needed: just save the original bits and the received signal of your own system as a .mat file (if you use Matlab) or a .npy file. Then load the data in Python and call the model's .fit function, with the original bits as the labels and the received signal as the input of the network. You do not even need to simulate the channel (you already did that in your own system, and only the received signal is required). A minimal sketch is given below.
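For instance, assuming you have saved your recordings under the hypothetical file names below, the whole training step reduces to something like this:

```python
import numpy as np
from tensorflow import keras
# from scipy.io import loadmat   # use this instead if your data is .mat

# Hypothetical file names: replace them with your own recordings.
original_bits = np.load('./original_bits.npy')      # shape (n_samples, n_bits)
received_signal = np.load('./received_signal.npy')  # shape (n_samples, n_rx)

# The received signal is the network input, the original bits are the labels.
model = keras.Sequential([
    keras.layers.Dense(500, activation='relu',
                       input_shape=(received_signal.shape[1],)),
    keras.layers.Dense(250, activation='relu'),
    keras.layers.Dense(original_bits.shape[1], activation='sigmoid'),
])
model.compile(optimizer='adam', loss='mse')
model.fit(received_signal, original_bits, epochs=100, batch_size=256,
          validation_split=0.1)
```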

Sorry for my English. If you have any problems, please contact me via email. I hope this repo is helpful for you; if it is, please star or fork it to show your support.
