Activtion

🔬 A Custom Activation Function for Deep Neural Networks

This project presents a novel activation function, P-SiLU, designed to improve the training performance and feature-learning capability of deep neural networks. It aims to address limitations of traditional activations such as ReLU and Sigmoid.
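This README does not spell out the P-SiLU formula. As a loose illustration only, a parametric SiLU-style function could take the form x · sigmoid(βx) with a tunable β; the actual definition lives in the repository's model code and may differ:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def silu(x):
    # Standard SiLU (swish-1): x * sigmoid(x)
    return x * sigmoid(x)

def p_silu(x, beta=1.0):
    # Hypothetical parametric variant: x * sigmoid(beta * x).
    # NOT necessarily the repository's actual P-SiLU definition.
    return x * sigmoid(beta * x)
```

Like SiLU, this form is smooth and non-monotonic near zero, which is the usual motivation for replacing ReLU in deep CNNs.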

📁 Project Structure

```
├── DataLoader/         # Modules for loading and preprocessing datasets
├── Save_Data/          # Directory to store training/testing results
├── read_data.py        # Script for initial data reading and preparation
├── test.py             # Script for standalone testing
├── test_run.py         # Main script for running training and evaluation
└── README.md           # Project documentation (this file)
```

⚙️ Key Features
- Implementation of new custom activation functions
- Plug-and-play integration with PyTorch models
- Supports multiple CNN architectures (e.g., VGG16, ResNet50, DenseNet121)
- Modular data loading, training, evaluation, and saving routines
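"Plug-and-play" here means the activation is an `nn.Module` that can be dropped anywhere `nn.ReLU` would go. A minimal sketch — the class name and learnable-β form are assumptions, not the repository's actual implementation:

```python
import torch
import torch.nn as nn

class PSiLU(nn.Module):
    """Hypothetical learnable-slope SiLU: x * sigmoid(beta * x)."""
    def __init__(self, beta: float = 1.0):
        super().__init__()
        # beta is a learnable parameter updated alongside the model weights
        self.beta = nn.Parameter(torch.tensor(float(beta)))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x * torch.sigmoid(self.beta * x)

# Drop-in replacement anywhere nn.ReLU would be used:
model = nn.Sequential(nn.Linear(8, 16), PSiLU(), nn.Linear(16, 10))
out = model(torch.randn(4, 8))
```

Because the activation is a standard module, the same pattern works inside larger architectures such as VGG16 or ResNet50 by swapping the activation layers.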

🚀 How to Use
1️⃣ Clone the repository
```bash
git clone https://github.com/cyl112233/Activtion.git
cd Activtion
```
2️⃣ Install dependencies
This project is built with Python 3.8+ and PyTorch >= 1.12.

Example:

```bash
pip install torch torchvision tqdm
```
3️⃣ Prepare datasets
Datasets used in experiments:

- CIFAR10
- CIFAR100
- MNIST
- FashionMNIST
- TinyImageNet

Download scripts and dataloaders are in DataLoader/. By default, data will be saved under:

```
/Y_L/fei/Activtion/Activtion/DataLoader/Data1
```

You can change this path in test_run.py.

📌 Training Configuration
Below are the main hyperparameters used in experiments:

| Parameter | Value |
| --- | --- |
| Optimizer | Adam |
| Learning Rate | 1e-4 |
| Weight Decay | 0 (default) |
| Batch Size | 64 |
| Epochs | 100 |
| Activation Functions | ReLU, SiLU, EELU, P-SiLU, and others |
| Data Augmentation | Random Horizontal Flip, Random Crop (for CIFAR/TinyImageNet) |
| Scheduler | None by default |

Note: You can modify hyperparameters directly in test_run.py or model_buidiling/modle.py.
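The configuration above can be wired up in PyTorch roughly as follows. The toy linear model is a stand-in for illustration only; the repository trains real CNNs such as VGG16:

```python
import torch
import torch.nn as nn

# Toy stand-in model; the repository builds CNNs (VGG16, ResNet50, ...).
model = nn.Sequential(nn.Linear(32, 64), nn.SiLU(), nn.Linear(64, 10))

# Settings from the table: Adam, lr=1e-4, weight decay 0, batch size 64.
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4, weight_decay=0.0)
criterion = nn.CrossEntropyLoss()

# One illustrative training step on random data:
x = torch.randn(64, 32)
y = torch.randint(0, 10, (64,))
optimizer.zero_grad()
loss = criterion(model(x), y)
loss.backward()
optimizer.step()
```

With no scheduler configured (the default), the learning rate stays at 1e-4 for all 100 epochs.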

🎓 How to Run
Example command to train and test on CIFAR100 with EELU2 activation and VGG16:

```bash
python test_run.py
```

The script will:
- Load data using get_dataloader()
- Build the specified model architecture with the selected activation
- Save training logs and final results to Save_Data/
📊 Results
Each run logs:
- Accuracy
- Precision (macro average)
- Recall (macro average)
- Loss

Logs are saved as .txt files in Save_Data/.
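Macro averaging means each class contributes equally to precision and recall, regardless of class frequency. A hand-rolled sketch of the computation (the repository may use a library such as scikit-learn instead):

```python
# Macro-averaged precision/recall over integer class labels.
def macro_precision_recall(y_true, y_pred, num_classes):
    precisions, recalls = [], []
    for c in range(num_classes):
        tp = sum(t == c and p == c for t, p in zip(y_true, y_pred))
        fp = sum(t != c and p == c for t, p in zip(y_true, y_pred))
        fn = sum(t == c and p != c for t, p in zip(y_true, y_pred))
        # Per-class precision and recall, with 0.0 when undefined:
        precisions.append(tp / (tp + fp) if tp + fp else 0.0)
        recalls.append(tp / (tp + fn) if tp + fn else 0.0)
    # Unweighted mean over classes = "macro" average
    return sum(precisions) / num_classes, sum(recalls) / num_classes
```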

📜 License
This project is open-source under the MIT License.

🤝 Contributing
Feel free to open Issues or Pull Requests for improvements, bug fixes, or new ideas!

📫 Contact
For questions related to this project, please reach out via GitHub Issues.

✅ Reproducibility Note
To support reproducibility:
- All hyperparameters and training settings are explicitly documented above.
- Example configurations are provided in the main scripts.
- The same data splits and random seeds can be set for consistency.
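Fixing the random seeds can be done with a small helper before training. This is a hypothetical helper, not a function shipped in this repository:

```python
import random
import torch

def set_seed(seed: int = 0) -> None:
    # Fix the RNGs so repeated runs share weight initialization and
    # data-shuffling order (hypothetical helper, not part of this repo).
    random.seed(seed)
    torch.manual_seed(seed)
    torch.cuda.manual_seed_all(seed)

set_seed(42)
a = torch.randn(3)
set_seed(42)
b = torch.randn(3)
# a and b are identical because the seed was reset between draws.
```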


