dixiyao/SpecNet

SpecNet

Replication of a series of networks that train and run inference entirely in the spectral/frequency domain. Implemented in PyTorch.

Implementation Details

Frequency Domain Operations

In this part, I mainly refer to the following works:

Bochen Guan, Jinnian Zhang, William A. Sethares, Richard Kijowski, Fang Liu.
Spectral Domain Convolutional Neural Network.
ICASSP 2021.

Oren Rippel, Jasper Snoek, Ryan P. Adams.
Spectral Representations for Convolutional Neural Networks.
NIPS 2015.

The current implementation supports the following operations in the frequency domain:

  • FFT layer & IFFT layer
  • Spectral Convolution (elementwise product)
  • Spectral Pooling
  • Spectral Activation (Tanh)
  • Spectral Normalization
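The spectral convolution above rests on the convolution theorem: an elementwise (Hadamard) product in the frequency domain equals a circular convolution in the spatial domain. A minimal sketch of that identity, using NumPy's FFT for brevity (the repo's PyTorch analogues are `torch.fft.fft2`/`torch.fft.ifft2`); the array names are my own illustration, not the repo's code:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal((8, 8))   # input "feature map"
k = rng.standard_normal((8, 8))   # kernel, zero-padded to the input size

# Frequency-domain path: one elementwise product per channel pair.
y_freq = np.real(np.fft.ifft2(np.fft.fft2(x) * np.fft.fft2(k)))

# Spatial-domain reference: direct circular convolution.
y_ref = np.zeros_like(x)
for i in range(8):
    for j in range(8):
        y_ref += k[i, j] * np.roll(np.roll(x, i, axis=0), j, axis=1)

# Both paths agree up to floating-point error.
assert np.allclose(y_freq, y_ref)
```

This is why the network can keep all convolutions as cheap elementwise products once the input has been transformed, needing only one FFT at the entry and one IFFT at the exit.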

Efficiency and energy saving

The current implementation performs only one FFT and one IFFT over the whole network. However, it is still not efficient in computation or memory.
First, the spectral kernel is realized by storing it in the spatial domain and transforming it to the frequency domain with an FFT on every forward pass; this incurs the overhead of an FFT per conv kernel.
Second, parameter storage is not memory-efficient. Because the spectrum of a real signal is conjugate (Hermitian) symmetric, only about half of the $n^2$ complex coefficients need to be stored, yet the current code stores all of them.
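The Hermitian-symmetry saving can be seen directly with the real-input FFT: for an $n \times n$ real map, `rfft2` keeps only $n \times (n/2 + 1)$ complex coefficients, and the full spectrum is exactly recoverable from that half. A small illustration (NumPy here; PyTorch offers the analogous `torch.fft.rfft2`/`torch.fft.irfft2`):

```python
import numpy as np

n = 8
x = np.random.default_rng(1).standard_normal((n, n))

full = np.fft.fft2(x)    # n x n complex coefficients
half = np.fft.rfft2(x)   # n x (n//2 + 1) complex coefficients -- roughly half

assert full.shape == (n, n)
assert half.shape == (n, n // 2 + 1)

# No information is lost: irfft2 inverts the half-spectrum exactly.
assert np.allclose(np.fft.irfft2(half, s=(n, n)), x)
```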
To address these issues, I mainly refer to

Jong Hwan Ko, Burhan Mudassar, Taesik Na, Saibal Mukhopadhyay.
Design of an Energy-Efficient Accelerator for Training of Convolutional Neural Networks using Frequency-Domain Computation.
DAC 2017.

Todo

  • Sinc Interpolation
  • Hermitian Symmetry
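For context on the spectral pooling operation and the sinc-interpolation todo: truncating the centered spectrum to the target size is equivalent to sinc interpolation in the spatial domain (Rippel et al. 2015). A minimal NumPy sketch under my own naming (not the repo's code); the final scale factor preserves the input mean:

```python
import numpy as np

def spectral_pool(x, out_h, out_w):
    """Crop the centered low-frequency block of the spectrum (illustrative)."""
    h, w = x.shape
    f = np.fft.fftshift(np.fft.fft2(x))        # move DC to the center
    top, left = (h - out_h) // 2, (w - out_w) // 2
    f = f[top:top + out_h, left:left + out_w]  # keep low frequencies only
    y = np.real(np.fft.ifft2(np.fft.ifftshift(f)))
    return y * (out_h * out_w) / (h * w)       # rescale so the mean is kept

x = np.random.default_rng(2).standard_normal((8, 8))
y = spectral_pool(x, 4, 4)
assert y.shape == (4, 4)
assert np.isclose(y.mean(), x.mean())
```

Unlike max pooling, this reduces resolution without aliasing, since high frequencies are removed rather than folded back.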

Result

Currently, on MNIST the spectral network converges close to the spatial network, but at a much higher memory and latency cost.
Hyperparameters: Adam for both, LR = 0.001, batch size 512.

MNIST_vanilia
