
MAMGD_Optimizer

Gradient optimization method using exponential damping and second-order discrete derivative for neural networks and multidimensional real functions

Gradient-based Optimization Method for Multidimensional Real Functions and Neural Networks

This project develops a gradient-based optimization method (MAMGD) for multidimensional real functions and neural networks, using exponential damping and a second-order discrete derivative, implemented with TensorFlow and Keras.
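The exact MAMGD update rule is defined in the repository source; as a rough illustrative sketch only, the two ingredients named above can be combined like this, where the second-order discrete derivative is approximated by the difference of successive gradients and the exponential damping by a decaying accumulator (the function name, hyperparameters, and the way the terms are combined are all assumptions, not the project's actual implementation):

```python
import numpy as np

def mamgd_like_step(theta, grad, prev_grad, acc, lr=0.1, beta=0.9, eps=1e-8):
    """One hypothetical MAMGD-style step (illustrative sketch only).

    d2   -- second-order discrete derivative, approximated as the
            difference between the current and previous gradients
    acc  -- exponentially damped accumulator of squared magnitudes
    """
    d2 = grad - prev_grad
    acc = beta * acc + (1.0 - beta) * (grad ** 2 + d2 ** 2)  # exponential damping
    theta = theta - lr * grad / (np.sqrt(acc) + eps)         # damped, scaled step
    return theta, acc

# Toy run: minimize f(x) = x^2, whose gradient is 2x.
theta = np.array([5.0])
acc = np.zeros_like(theta)
prev_g = np.zeros_like(theta)
for _ in range(200):
    g = 2.0 * theta
    theta, acc = mamgd_like_step(theta, g, prev_g, acc)
    prev_g = g
# theta has moved close to the minimizer x = 0
```

The square-root normalization mirrors the Adagrad/RMSProp family; it is used here only to show how a damped accumulator enters the step size, not to claim this is MAMGD's actual scaling.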

Technologies Used

  • TensorFlow
  • Keras
  • Matplotlib
  • NumPy

Features

  • Optimization of multidimensional real functions
  • Optimization of neural networks
  • Utilizes exponential decay for optimization
  • Integration with TensorFlow and Keras for seamless implementation
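As a concrete illustration of the exponential-decay feature listed above, the standard schedule (the same formula used by `tf.keras.optimizers.schedules.ExponentialDecay`) shrinks the learning rate by a fixed factor every `decay_steps` steps; this pure-Python sketch shows the formula itself, independent of how MAMGD applies it:

```python
def exponential_decay(lr0, decay_rate, step, decay_steps=100):
    # Learning rate after `step` steps: lr0 * decay_rate ^ (step / decay_steps).
    # Every `decay_steps` steps the rate is multiplied by `decay_rate`.
    return lr0 * decay_rate ** (step / decay_steps)

print(exponential_decay(0.1, 0.96, 0))    # → 0.1 (no decay yet)
```

With `lr0=0.1`, `decay_rate=0.96`, and `decay_steps=100`, the rate drops to `0.096` after 100 steps and keeps shrinking geometrically thereafter.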
