MLP-Function-Approximation

An implementation of a multilayer perceptron (MLP) for function approximation.

Requirements

Python 3.7
torch 1.0.0
NumPy 1.15.2
Matplotlib 1.3.1

Description

This Python script implements the backpropagation algorithm for a 1-S-1 network architecture, where S is the number of neurons in the hidden layer. Torch tensors are used for the matrix operations in the forward and backward propagation. The weights and biases are randomly initialized from a uniform distribution between -0.5 and 0.5 (using the rand function).
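The forward and backward passes can be sketched roughly as follows. This is a hedged illustration rather than the repository's exact code: the hidden-layer activation is assumed to be a sigmoid, and the values of S and the learning rate are taken from the Result table below.

```python
# Minimal sketch of a 1-S-1 network trained by manual backpropagation with
# torch tensors; weights and biases drawn uniformly from [-0.5, 0.5).
import torch

S = 100      # hidden-layer neurons (value from the Result table)
lr = 0.01    # learning rate (value from the Result table)

# torch.rand samples from [0, 1); shifting by 0.5 gives [-0.5, 0.5)
W1 = torch.rand(S, 1) - 0.5
b1 = torch.rand(S, 1) - 0.5
W2 = torch.rand(1, S) - 0.5
b2 = torch.rand(1, 1) - 0.5

def forward(p):
    """Forward pass: sigmoid hidden layer, linear output layer."""
    a1 = torch.sigmoid(W1 @ p + b1)   # hidden activation, shape (S, 1)
    a2 = W2 @ a1 + b2                 # network output, shape (1, 1)
    return a1, a2

def backward(p, t, a1, a2):
    """Backpropagate sensitivities and apply one gradient-descent update."""
    s2 = -2 * (t - a2)                      # output-layer sensitivity
    s1 = a1 * (1 - a1) * (W2.t() @ s2)      # hidden-layer sensitivity
    W2.sub_(lr * s2 @ a1.t())
    b2.sub_(lr * s2)
    W1.sub_(lr * s1 @ p.t())
    b1.sub_(lr * s1)
```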

Example approximation function

Result


| Neurons | Learning rate | Epochs of training |
| ------- | ------------- | ------------------ |
| 100     | 0.01          | 1000               |
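A hypothetical training loop using these settings, reusing forward() and backward() from the sketch in the Description section, might look like the following; the target function g(p) and the training interval are stand-ins, not the repository's example function.

```python
# Hypothetical training loop: S = 100 neurons, learning rate 0.01, 1000 epochs.
import math
import torch

def g(p):
    return 1 + torch.sin(math.pi / 4 * p)   # stand-in target function

samples = torch.linspace(-2.0, 2.0, 41)      # assumed training interval
for epoch in range(1000):                    # 1000 epochs, as in the table
    for x in samples:
        p = x.reshape(1, 1)
        a1, a2 = forward(p)
        backward(p, g(p), a1, a2)
```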
