NeuroFox

Logo

Note: For the Turkish version of this document, refer to README_TR.md.

πŸ“„ Table of Contents

  1. Project Summary
  2. Project File Structure
  3. Data Generation Functions
  4. Neural Network Layers
  5. Activation Functions
  6. Regularization Layers
  7. Dense Layer
  8. Loss Functions
  9. Neural Network Class
  10. Optimizers
  11. Learning Rate Scheduler
  12. Utilities
  13. Example Usage

πŸ“‚ Project Summary

NeuroFox is a NumPy-based neural network framework that provides the core components of a neural network: layers, activation functions, loss functions, optimizers, and data utilities. It includes example models for binary classification, the XOR problem, and the Iris dataset, which analyze performance across different activation functions.

πŸ“‚ Project File Structure

NeuroFox/
β”‚
β”œβ”€β”€ assets/
β”‚   β”œβ”€β”€ linear_activation.png         # Graph of the linear activation function
β”‚   β”œβ”€β”€ logo.png                      # Project logo
β”‚   β”œβ”€β”€ relu_activation.png           # Graph of the ReLU activation function
β”‚   β”œβ”€β”€ sigmoid_activation.png        # Graph of the sigmoid activation function
β”‚   └── softmax_activation.jpg        # Graph of the softmax activation function
β”‚
β”œβ”€β”€ data/
β”‚   β”œβ”€β”€ __init__.py                   # File defining the data module
β”‚   └── data.py                       # File containing data generation functions
β”‚
β”œβ”€β”€ layers/
β”‚   β”œβ”€β”€ __init__.py                   # File defining the layers module
β”‚   β”œβ”€β”€ dense.py                      # File containing the Dense layer class
β”‚   β”œβ”€β”€ dropout.py                    # File containing the Dropout regularization layer
β”‚   β”œβ”€β”€ layer.py                      # File containing the base layer class
β”‚   └── activations/                  # Activation functions
β”‚       β”œβ”€β”€ __init__.py               # File defining the activations module
β”‚       β”œβ”€β”€ linear.py                 # File containing the linear activation function
β”‚       β”œβ”€β”€ relu.py                   # File containing the ReLU activation function
β”‚       β”œβ”€β”€ sigmoid.py                # File containing the sigmoid activation function
β”‚       └── softmax.py                # File containing the softmax activation function
β”‚
β”œβ”€β”€ losses/
β”‚   β”œβ”€β”€ __init__.py                   # File defining the losses module
β”‚   β”œβ”€β”€ binary_crossentropy.py        # File containing the binary cross-entropy loss function
β”‚   β”œβ”€β”€ binary_focal_loss.py          # File containing the binary focal loss function
β”‚   └── categorical_crossentropy.py   # File containing the categorical cross-entropy loss function
β”‚
β”œβ”€β”€ neural_network/
β”‚   β”œβ”€β”€ __init__.py                   # File defining the neural_network module
β”‚   └── neural_network.py             # File defining the neural network structure
β”‚
β”œβ”€β”€ optimizers/
β”‚   β”œβ”€β”€ __init__.py                   # File defining the optimizers module
β”‚   β”œβ”€β”€ adagrad_optimizer.py          # File containing the Adagrad optimization algorithm
β”‚   β”œβ”€β”€ adam_optimizer.py             # File containing the Adam optimization algorithm
β”‚   β”œβ”€β”€ learning_rate_scheduler.py    # File containing the learning rate scheduler
β”‚   β”œβ”€β”€ rmsprop_optimizer.py          # File containing the RMSprop optimization algorithm
β”‚   └── sgd_optimizer.py              # File containing the Stochastic Gradient Descent (SGD) optimization algorithm
β”‚
β”œβ”€β”€ utils/
β”‚   β”œβ”€β”€ __init__.py                   # File defining the utils module
β”‚   β”œβ”€β”€ binary_classification.py      # Tools for generating binary classification data
β”‚   β”œβ”€β”€ model_utils.py                # Various utility functions related to models
β”‚   β”œβ”€β”€ one_hot_encoding.py           # One-hot encoding function
β”‚   β”œβ”€β”€ standard_scaler.py            # Function for standardizing data
β”‚   └── train_test_split.py           # Function for splitting data into training and testing sets
β”‚
β”œβ”€β”€ binary_classification_model.py    # Example of a binary classification model
β”œβ”€β”€ iris_dataset_model.py             # Example model with the IRIS dataset
β”œβ”€β”€ xor_model.py                      # Example model with the XOR dataset
└── README.md                         # General information about the project, installation, and usage instructions

File Descriptions

  • assets/: Visual files related to the project, including activation function graphs.
  • data/: Functions for generating and testing training data.
  • layers/: Neural network layers and activation functions.
  • losses/: Loss functions and their implementations.
  • neural_network/: Building blocks of the neural network model.
  • optimizers/: Optimization algorithms and learning rate schedulers.
  • utils/: Functions for data processing, model management, and other utilities.
  • README.md: General project information, installation instructions, usage details, and examples.

πŸ”§ Data Generation Functions

create_xor_data(num_samples)

Generates XOR data for binary classification tasks.

  • Usage:

    X, y = create_xor_data(1000)
  • Parameters:

    • num_samples (int): Number of data points to generate.
  • Returns:

    • X: Input features
    • y: Labels
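The generator itself lives in data/data.py; below is a minimal NumPy sketch of what such a function could look like (illustrative only, with an assumed noise parameter, not the project's actual implementation):

import numpy as np

def create_xor_data(num_samples, noise=0.1):
    # Illustrative sketch only, not the repository's actual code.
    # Random binary input pairs, e.g. (0, 1), (1, 1), ...
    bits = np.random.randint(0, 2, size=(num_samples, 2))
    # The label is 1 exactly when the two inputs differ (logical XOR)
    y = np.logical_xor(bits[:, 0], bits[:, 1]).astype(float).reshape(-1, 1)
    # Small Gaussian noise keeps the points from being perfectly separable
    X = bits.astype(float) + np.random.normal(0.0, noise, size=bits.shape)
    return X, y

X, y = create_xor_data(1000)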

create_binary_classification_data(num_samples=1000)

Generates binary classification data with an option to add noise.

  • Usage:

    X, y = create_binary_classification_data(num_samples=1000, noise=0.1)
  • Parameters:

    • num_samples (int): Number of data points to generate.
    • noise (float): Amount of noise added to the generated data.
  • Returns:

    • X: Input features
    • y: Labels
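A hedged sketch of such a generator, assuming two Gaussian clusters (one per class) whose spread grows with the noise setting; the real function in the repository may differ:

import numpy as np

def create_binary_classification_data(num_samples=1000, noise=0.1):
    # Illustrative sketch only: two noisy Gaussian blobs, one per class.
    half = num_samples // 2
    X = np.vstack([
        np.random.normal(-1.0, 0.5 + noise, size=(half, 2)),                # class 0
        np.random.normal(+1.0, 0.5 + noise, size=(num_samples - half, 2)),  # class 1
    ])
    y = np.vstack([np.zeros((half, 1)), np.ones((num_samples - half, 1))])
    idx = np.random.permutation(num_samples)  # shuffle so classes are interleaved
    return X[idx], y[idx]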

🧩 Neural Network Layers

Layer

Base class for all layers within the neural network.

βš™οΈ Activation Functions

ActivationSoftmax

Applies the Softmax activation function to the input.

  • Softmax Formula:

    • $$\text{Softmax}(z_i) = \frac{e^{z_i}}{\sum_{j=1}^K e^{z_j}}$$
  • Graph: assets/softmax_activation.jpg

ActivationSigmoid

Applies the Sigmoid activation function to the input.

  • Sigmoid Formula:

    • $$\sigma(x) = \frac{1}{1 + e^{-x}}$$
  • Graph: assets/sigmoid_activation.png

ActivationReLU

Applies the ReLU activation function to the input.

  • ReLU Formula:

    • $$\text{ReLU}(x) = \max(0, x)$$
  • Graph: assets/relu_activation.png

ActivationLinear

Applies the linear activation function to the input (no change).

  • Linear Formula:

    • $$f(x) = x$$
  • Graph: assets/linear_activation.png
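The formulas above map directly onto NumPy. A condensed, self-contained sketch of the four forward passes (independent of how the project's activation classes are laid out):

import numpy as np

def linear(x):
    return x                        # f(x) = x, identity

def relu(x):
    return np.maximum(0.0, x)       # max(0, x), element-wise

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def softmax(z):
    # Subtracting the row-wise max is a standard numerical-stability trick;
    # it does not change the result.
    z = z - np.max(z, axis=-1, keepdims=True)
    e = np.exp(z)
    return e / np.sum(e, axis=-1, keepdims=True)

print(softmax(np.array([[1.0, 2.0, 3.0]])))  # each row sums to 1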

πŸ”„ Regularization Layers

Dropout(rate=0.5)

Applies dropout regularization.

  • Usage:

    dropout_layer = Dropout(rate=0.5)
  • Parameters:

    • rate (float): The proportion of input units to drop.
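A common way to realize this is "inverted" dropout, where the surviving units are rescaled at training time so the expected activation stays constant. The sketch below shows that approach; the project's Dropout class may differ in detail:

import numpy as np

def dropout_forward(inputs, rate=0.5, training=True):
    # At inference time dropout is a no-op
    if not training or rate == 0.0:
        return inputs
    # Keep each unit with probability (1 - rate), then rescale so the
    # expected activation matches the no-dropout case (inverted dropout)
    keep_prob = 1.0 - rate
    mask = (np.random.rand(*inputs.shape) < keep_prob) / keep_prob
    return inputs * mask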

πŸ”’ Dense Layer

Dense(input_size, output_size)

A fully connected layer in the neural network.

  • Usage:

    dense_layer = Dense(input_size=128, output_size=64)
  • Parameters:

    • input_size (int): Number of input features.
    • output_size (int): Number of output features.
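Conceptually the layer stores a weight matrix of shape (input_size, output_size) and a bias vector, and its forward pass is an affine transform. An illustrative sketch of the forward pass only (not the project's actual Dense class, which also has to handle training):

import numpy as np

class DenseSketch:
    """Illustrative forward pass only; not the repository's Dense implementation."""

    def __init__(self, input_size, output_size):
        # Small random weights and zero biases
        self.weights = 0.01 * np.random.randn(input_size, output_size)
        self.biases = np.zeros((1, output_size))

    def forward(self, inputs):
        # y = x @ W + b for a batch of row vectors
        return inputs @ self.weights + self.biases

layer = DenseSketch(input_size=128, output_size=64)
out = layer.forward(np.random.randn(32, 128))  # -> shape (32, 64)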

πŸ“‰ Loss Functions

BinaryCrossentropy

Calculates binary cross-entropy loss.

  • Formula:
    • $$L = -\frac{1}{N}\sum_{i=1}^{N} [y_i \log(\hat{y}_i) + (1-y_i) \log(1-\hat{y}_i)]$$

CategoricalCrossentropy

Calculates categorical cross-entropy loss.

  • Formula:
    • $$L = -\sum_{i=1}^{N} y_i \log(\hat{y}_i)$$

BinaryFocalLoss(gamma=2, alpha=0.25)

Calculates binary focal loss, often used to address class imbalance.

  • Formula:

    • $$\text{FL}(p_t) = -\alpha_t (1 - p_t)^\gamma \log(p_t)$$
  • Parameters:

    • gamma (float): Focusing parameter.
    • alpha (float): Balancing factor.
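The three formulas above can be computed directly in NumPy. The sketch below assumes y_pred holds probabilities, y_true holds 0/1 or one-hot targets, and the result is averaged over the batch; predictions are clipped to avoid log(0):

import numpy as np

EPS = 1e-7  # keeps log() away from zero

def binary_crossentropy(y_true, y_pred):
    p = np.clip(y_pred, EPS, 1 - EPS)
    return -np.mean(y_true * np.log(p) + (1 - y_true) * np.log(1 - p))

def categorical_crossentropy(y_true, y_pred):
    # y_true is one-hot encoded, y_pred contains softmax probabilities
    p = np.clip(y_pred, EPS, 1 - EPS)
    return -np.mean(np.sum(y_true * np.log(p), axis=-1))

def binary_focal_loss(y_true, y_pred, gamma=2.0, alpha=0.25):
    p = np.clip(y_pred, EPS, 1 - EPS)
    # p_t is the predicted probability of the true class
    p_t = np.where(y_true == 1, p, 1 - p)
    alpha_t = np.where(y_true == 1, alpha, 1 - alpha)
    return -np.mean(alpha_t * (1 - p_t) ** gamma * np.log(p_t))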

πŸ”§ Neural Network Class

NeuralNetwork

The main class for defining and training neural networks.

  • Usage:

    nn = NeuralNetwork()
    nn.add_layer(Dense(128, 64))
    nn.add_layer(ActivationReLU())
    nn.compile(loss=BinaryCrossentropy(), optimizer=AdamOptimizer(learning_rate=0.001))
    nn.fit(X_train, y_train, epochs=10, batch_size=32)

βš™οΈ Optimizers

AdamOptimizer(learning_rate=0.001)

The Adam optimization algorithm.

  • Usage:

    optimizer = AdamOptimizer(learning_rate=0.001)
  • Parameters:

    • learning_rate (float): Learning rate for the optimizer.

SGDOptimizer(learning_rate=0.01)

Stochastic Gradient Descent optimizer.

  • Usage:

    optimizer = SGDOptimizer(learning_rate=0.01)
  • Parameters:

    • learning_rate (float): Learning rate for the optimizer.
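For reference, the standard per-step update rules behind these two optimizers look roughly as follows in NumPy (a generic sketch of SGD and Adam, not the project's exact classes):

import numpy as np

def sgd_step(param, grad, learning_rate=0.01):
    # Plain stochastic gradient descent: move against the gradient
    return param - learning_rate * grad

def adam_step(param, grad, m, v, t, learning_rate=0.001,
              beta1=0.9, beta2=0.999, eps=1e-8):
    # Exponential moving averages of the gradient and its square
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad ** 2
    # Bias correction; t is the 1-based step count
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    param = param - learning_rate * m_hat / (np.sqrt(v_hat) + eps)
    return param, m, v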

πŸ“ˆ Learning Rate Scheduler

LearningRateScheduler

Adjusts the learning rate during training.

  • Usage:
    scheduler = LearningRateScheduler(schedule=lambda epoch: 0.001 * 0.95 ** epoch)
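With this schedule the learning rate starts at 0.001 and is multiplied by 0.95 each epoch: 0.001 at epoch 0, 0.00095 at epoch 1, roughly 0.0009 at epoch 2, and so on.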

πŸ› οΈ Utilities

train_test_split(X, y, test_size=0.2)

Splits data into training and testing sets.

  • Usage:

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)
  • Parameters:

    • X (array): Features.
    • y (array): Labels.
    • test_size (float): Proportion of the dataset to include in the test split.
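A typical implementation shuffles row indices before slicing; below is an illustrative sketch under that assumption (the actual helper in utils/train_test_split.py may differ):

import numpy as np

def train_test_split(X, y, test_size=0.2):
    # Shuffle indices so the split is random rather than positional
    idx = np.random.permutation(len(X))
    n_test = int(len(X) * test_size)
    test_idx, train_idx = idx[:n_test], idx[n_test:]
    return X[train_idx], X[test_idx], y[train_idx], y[test_idx]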

πŸ“š Example Usage

Binary Classification Example

from neural_network import NeuralNetwork
from layers import Dense, ActivationReLU
from losses import BinaryCrossentropy
from optimizers import AdamOptimizer
from utils import create_binary_classification_data, train_test_split

# Generate data
X, y = create_binary_classification_data()

# Split data
X_train, X_test, y_train, y_test = train_test_split(X, y)

# Create and train model
model = NeuralNetwork()
model.add_layer(Dense(input_size=2, output_size=8))
model.add_layer(ActivationReLU())
model.add_layer(Dense(input_size=8, output_size=1))
model.compile(loss=BinaryCrossentropy(), optimizer=AdamOptimizer())
model.fit(X_train, y_train, epochs=10)

# Evaluate model
accuracy = model.evaluate(X_test, y_test)
print(f"Accuracy: {accuracy * 100:.2f}%")

IRIS Dataset Example

from neural_network import NeuralNetwork
from layers import Dense, ActivationReLU, ActivationSoftmax
from losses import CategoricalCrossentropy
from optimizers import AdamOptimizer
from utils import load_iris_data, train_test_split

# Load data
X, y = load_iris_data()

# Split data
X_train, X_test, y_train, y_test = train_test_split(X, y)

# Create and train model
model = NeuralNetwork()
model.add_layer(Dense(input_size=4, output_size=32))
model.add_layer(ActivationReLU())
model.add_layer(Dense(input_size=32, output_size=3))
model.add_layer(ActivationSoftmax())
model.compile(loss=CategoricalCrossentropy(), optimizer=AdamOptimizer())
model.fit(X_train, y_train, epochs=10)

# Evaluate model
accuracy = model.evaluate(X_test, y_test)
print(f"Accuracy: {accuracy * 100:.2f}%")

XOR Dataset Example

from neural_network import NeuralNetwork
from layers import Dense, ActivationReLU
from losses import BinaryCrossentropy
from optimizers import AdamOptimizer
from utils import create_xor_data, train_test_split

# Generate data
X, y = create_xor_data(1000)

# Split data
X_train, X_test, y_train, y_test = train_test_split(X, y)

# Create and train model
model = NeuralNetwork()
model.add_layer(Dense(input_size=2, output_size=8))
model.add_layer(ActivationReLU())
model.add_layer(Dense(input_size=8, output_size=1))
model.compile(loss=BinaryCrossentropy(), optimizer=AdamOptimizer())
model.fit(X_train, y_train, epochs=10)

# Evaluate model
accuracy = model.evaluate(X_test, y_test)
print(f"Accuracy: {accuracy * 100:.2f}%")
