SBOAtools

SBOAtools is an R package implementing the Secretary Bird Optimization Algorithm (SBOA). The package supports both general-purpose continuous optimization and the training of single-hidden-layer multilayer perceptrons (MLPs).

It is intended for researchers working in metaheuristic optimization, computational intelligence, and neural network training. The package allows users to apply SBOA either as a standalone optimizer or as a training algorithm for feed-forward neural networks.


Features

  • General-purpose continuous optimization with sboa()
  • Single-hidden-layer MLP training with sboa_mlp()
  • Prediction support via predict()
  • Convergence visualization via plot()
  • Model summaries via print()

Installation

During development, the package can be installed from a local source checkout using:

```r
devtools::install()
```

Then load the package with:

```r
library(SBOAtools)
```

You can also install the development version from GitHub:

```r
install.packages("remotes")
remotes::install_github("burakdilber/SBOAtools")
```

Main Functions

sboa()

Performs general-purpose continuous optimization using the Secretary Bird Optimization Algorithm.

sboa_mlp()

Trains a single-hidden-layer multilayer perceptron using the Secretary Bird Optimization Algorithm.


Example 1: General Optimization

```r
library(SBOAtools)

# Sphere benchmark: global minimum 0 at the origin
sphere <- function(x) sum(x^2)

res <- sboa(
  fn = sphere,
  lower = rep(-10, 5),   # lower bounds (5-dimensional problem)
  upper = rep(10, 5),    # upper bounds
  n_agents = 10,         # population size
  max_iter = 20,         # number of iterations
  seed = 123
)

print(res)
plot(res)   # convergence curve

res$value   # best objective function value found
res$par     # best solution found
```

Example 2: MLP Training with SBOA

```r
library(SBOAtools)

set.seed(123)

# Toy regression data: 10 samples, 4 predictors
X_train <- matrix(runif(40), nrow = 10, ncol = 4)
y_train <- matrix(runif(10), nrow = 10, ncol = 1)

fit_mlp <- sboa_mlp(
  X_train = X_train,
  y_train = y_train,
  hidden_dim = 3,   # number of hidden neurons
  n_agents = 10,
  max_iter = 20,
  lower = -1,       # bounds for the network weights
  upper = 1,
  seed = 123
)

print(fit_mlp)
plot(fit_mlp)   # training convergence curve

pred <- predict(fit_mlp, X_train)
pred
```

Returned Objects

Output of sboa()

The sboa() function returns an object of class "sboa" containing:

  • par: best solution found
  • value: best objective function value
  • convergence: convergence curve over iterations
  • population: final population matrix
  • fitness: final fitness values of the population
  • call: matched function call
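To make the structure concrete, the following base-R sketch builds a mock object with the documented fields and shows how they can be accessed. All values are placeholders for illustration; this object is not produced by SBOAtools.

```r
# Mock object with the documented fields of class "sboa" (illustrative only;
# values are placeholders, not real SBOAtools output).
res <- structure(
  list(
    par         = c(0.01, -0.02, 0.00),        # best solution found
    value       = 5e-4,                        # best objective function value
    convergence = c(1.2, 0.3, 0.05, 5e-4),     # best value per iteration
    population  = matrix(rnorm(30), 10, 3),    # final population (10 agents)
    fitness     = runif(10),                   # final fitness values
    call        = quote(sboa(fn = sphere))     # matched call
  ),
  class = "sboa"
)

res$value                 # access the best objective value
length(res$convergence)   # one entry per recorded iteration
```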

Output of sboa_mlp()

The sboa_mlp() function returns an object of class "sboa_mlp" containing:

  • par: optimized neural network parameters
  • value: best objective function value
  • convergence: convergence curve over iterations
  • input_dim: number of input variables
  • hidden_dim: number of hidden neurons
  • output_dim: number of output variables
  • x_min: minimum values used for input normalization
  • x_max: maximum values used for input normalization
  • y_min: minimum values used for output normalization
  • y_max: maximum values used for output normalization
  • fitted: fitted values on the original scale
  • metrics: training performance metrics
  • call: matched function call
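The x_min/x_max and y_min/y_max fields suggest column-wise min-max scaling. The sketch below shows how such scaling and its inverse work in base R; the helper names are illustrative and not the package's internal functions.

```r
# Illustrative min-max scaling to [0, 1] and its inverse
# (not SBOAtools internals).
minmax_scale <- function(X, x_min, x_max) {
  sweep(sweep(X, 2, x_min, "-"), 2, x_max - x_min, "/")
}
minmax_unscale <- function(Xs, x_min, x_max) {
  sweep(sweep(Xs, 2, x_max - x_min, "*"), 2, x_min, "+")
}

X <- matrix(c(1, 5, 3, 9), nrow = 2)      # two samples, two columns
x_min <- apply(X, 2, min)
x_max <- apply(X, 2, max)

Xs <- minmax_scale(X, x_min, x_max)       # all values now in [0, 1]
minmax_unscale(Xs, x_min, x_max)          # round trip recovers X
```

Keeping x_min/x_max from training allows predict() to apply the same scaling to new data and map predictions back to the original scale.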

Current Scope

The current version of the package supports:

  • continuous optimization problems
  • single-hidden-layer MLP models
  • sigmoid activation in hidden and output layers
  • regression-oriented neural network training with mean squared error (MSE)
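The model described above can be sketched in a few lines of base R: a single hidden layer, sigmoid activations in both layers, and an MSE objective. This is a minimal illustration of the network SBOA would tune the weights of, assuming standard definitions; it is not the package's internal implementation.

```r
# Single-hidden-layer MLP forward pass with sigmoid activations and MSE loss
# (illustrative sketch, not SBOAtools internals).
sigmoid <- function(z) 1 / (1 + exp(-z))

mlp_forward <- function(X, W1, b1, W2, b2) {
  H <- sigmoid(X %*% W1 + matrix(b1, nrow(X), length(b1), byrow = TRUE))
  sigmoid(H %*% W2 + matrix(b2, nrow(H), length(b2), byrow = TRUE))
}

mse <- function(y, y_hat) mean((y - y_hat)^2)

set.seed(1)
X <- matrix(runif(8), nrow = 4, ncol = 2)        # 4 samples, 2 inputs
y <- matrix(runif(4), ncol = 1)
W1 <- matrix(rnorm(2 * 3), 2, 3); b1 <- rnorm(3) # 3 hidden neurons
W2 <- matrix(rnorm(3 * 1), 3, 1); b2 <- rnorm(1) # 1 output neuron

y_hat <- mlp_forward(X, W1, b1, W2, b2)
mse(y, y_hat)   # the objective a metaheuristic would minimize over the weights
```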

Future Extensions

Possible future improvements include:

  • additional benchmark functions
  • engineering optimization examples
  • train/test evaluation helpers
  • classification support
  • alternative activation functions
  • multiple hidden layers
  • visualization utilities

Authors

  • Burak Dilber
  • A. Fırat Özdemir

License

MIT License

About

This is a read-only mirror of the CRAN R package repository.

SBOAtools — Secretary Bird Optimization for Continuous Optimization and Neural Networks.

Homepage: https://github.com/burakdilber/SBOAtools
Report bugs: https://github.com/burakdilber/SBOAtools/issues
