CORTICAL

This repository contains the official Keras implementation of cooperative channel capacity learning (CORTICAL).

If you use this repository for your experiments, please cite the paper:

N. A. Letizia, A. M. Tonello and H. V. Poor, "Cooperative Channel Capacity Learning," in IEEE Communications Letters, vol. 27, no. 8, pp. 1984-1988, Aug. 2023, doi: 10.1109/LCOMM.2023.3282307.

The paper presents a cooperative framework (CORTICAL) that jointly estimates the channel capacity and samples from the capacity-achieving input distribution using a combined generator/discriminator model.
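For context, the quantity being learned is the channel capacity, i.e., the maximum mutual information between the channel input X and output Y over all input distributions satisfying the power constraint. Roughly, the generator parameterizes the input distribution while the discriminator provides the mutual-information estimate:

$$
C = \max_{p(x)} I(X;Y) \quad \text{s.t. a power constraint, e.g. } \mathbb{E}[\|X\|^2] \le P \text{ (average) or } \|X\|^2 \le P \text{ (peak)}
$$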

Example of capacity learning for an AWGN channel (d=2) under a peak-power constraint (P=10).
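As a rough illustration of this example's setup (a sketch, not code from CORTICAL.py; the function names and the unit noise variance are assumptions), a d-dimensional AWGN channel with a peak-power projection can be written as:

import numpy as np

def awgn_channel(x, noise_std=1.0):
    # y = x + n, with n ~ N(0, noise_std^2 I) drawn independently per sample
    return x + noise_std * np.random.randn(*x.shape)

def project_peak_power(x, P=10.0):
    # enforce the peak-power constraint ||x||^2 <= P by rescaling any
    # sample that falls outside the ball of radius sqrt(P)
    norms = np.linalg.norm(x, axis=-1, keepdims=True)
    return x * np.minimum(1.0, np.sqrt(P) / np.maximum(norms, 1e-12))

# a batch of 512 two-dimensional channel inputs (d=2), as in the example above
x = project_peak_power(3.0 * np.random.randn(512, 2))
y = awgn_channel(x)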

CORTICAL training commands

If you want to train your own CORTICAL model and compare its performance with our results, run:

python CORTICAL.py

A variety of input arguments is available, e.g., the type of channel or the type of power constraint; check the argument definitions in CORTICAL.py for details. Use a command of the following form to set them:

python CORTICAL.py --batch_size 512 --epochs 500 --test_size 10000 --dim 1 --channel 'AWGN' --power_constraint 'PP'
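For orientation, the flags above correspond to an argparse interface roughly like the following (a sketch; the defaults and option handling in CORTICAL.py may differ):

import argparse

parser = argparse.ArgumentParser(description='CORTICAL training')
parser.add_argument('--batch_size', type=int, default=512)
parser.add_argument('--epochs', type=int, default=500)
parser.add_argument('--test_size', type=int, default=10000)
parser.add_argument('--dim', type=int, default=1, help='channel input dimension d')
parser.add_argument('--channel', type=str, default='AWGN', help='type of channel')
parser.add_argument('--power_constraint', type=str, default='PP', help="type of power constraint, e.g. 'PP' for peak power")
args = parser.parse_args()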

To change the value of the power constraint, edit the functions defined outside the CORTICAL class, as sketched below.
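As a hypothetical illustration (the names here are invented; the actual functions in CORTICAL.py may differ), such a module-level function might hard-code the constraint value, which is what you would edit:

import tensorflow as tf

P = 10.0  # edit this constant to change the power constraint value

def normalize_average_power(x):
    # rescale the batch so the empirical average power E[||x||^2] equals P
    avg_power = tf.reduce_mean(tf.reduce_sum(tf.square(x), axis=-1))
    return x * tf.sqrt(P / (avg_power + 1e-12))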

The output is a series of .mat files. Every 1000 epochs, a batch of generated channel input samples is saved. When execution terminates, estimates of the channel capacity and samples from the optimal (if well trained) input distribution are provided.
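To inspect the saved outputs (the file name below is a placeholder; use the names your run actually produces), the .mat files can be loaded with scipy:

from scipy.io import loadmat

data = loadmat('generated_samples.mat')  # placeholder file name
print(list(data.keys()))  # variables stored in the file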

The code has been tested on Python 3.6 with TensorFlow 1.15.2 and Keras 2.2.4. Adjust libraries and dependencies to match your system.
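One way to reproduce the tested environment, assuming pip on a Python 3.6 interpreter (scipy is only needed to read the .mat outputs):

pip install tensorflow==1.15.2 keras==2.2.4 scipy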

CORTICAL training evolution

The following GIFs show how CORTICAL learns the capacity-achieving distribution over time for different types of channels and power constraints.
