
Commit 6f645fb

Merge pull request #32 from gdalle/patch-1: Typos in docs

rafaelbailo committed May 4, 2024
2 parents: 278685e + 596ba7c

Showing 5 changed files with 5 additions and 5 deletions.
docs/src/distribution_sampling.md (1 addition, 1 deletion)

````diff
@@ -2,7 +2,7 @@
 ConsensusBasedX.jl also provides [Consensus-Based Sampling](@ref).
 
-The package exports `sample`, which behaves exactly as `minimise` in [Function minimisation](@ref). It assumes you have defined a function `f(x::AbstractVector)` that takes a single vector argumemt `x` of length `D = length(x)`.
+The package exports `sample`, which behaves exactly as `minimise` in [Function minimisation](@ref). It assumes you have defined a function `f(x::AbstractVector)` that takes a single vector argument `x` of length `D = length(x)`.
 
 For instance, if `D = 2`, you can sample `exp(-αf)` by running:
 ```julia
````
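The documented `sample` usage can be sketched as follows; the keyword-style call is taken from the package's docstring signatures, while the objective `f` is a hypothetical quadratic chosen for illustration:

```julia
using ConsensusBasedX

# Hypothetical objective: exp(-αf) is then a Gaussian-like density on R^2
f(x) = sum(abs2, x)

# D (the problem dimension) must always be specified
out = sample(f; D = 2)
```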
docs/src/particle_initialisation.md (1 addition, 1 deletion)

````diff
@@ -10,7 +10,7 @@ If no options are provided, ConsensusBasedX.jl initialises its particles by samp
 
 ## Initial guess
 
-If you have an initial guess for the global minimiser of the function `f`, you can pass the option `initial_guess` (or `initial_mean`). This can be a `Real`, if you want to use the same value for each coordinate of the initial guess, or an `AbstractVector` of size `size(initial_guess) = (D,)`. The particles will be initisalised by sampling a normal distribution with mean `initial_guess`/`initial_mean` and unit variance. [Full-code example](https://github.com/PdIPS/ConsensusBasedX.jl/blob/main/examples/basic_usage/initial_guess.jl).
+If you have an initial guess for the global minimiser of the function `f`, you can pass the option `initial_guess` (or `initial_mean`). This can be a `Real`, if you want to use the same value for each coordinate of the initial guess, or an `AbstractVector` of size `size(initial_guess) = (D,)`. The particles will be initialised by sampling a normal distribution with mean `initial_guess`/`initial_mean` and unit variance. [Full-code example](https://github.com/PdIPS/ConsensusBasedX.jl/blob/main/examples/basic_usage/initial_guess.jl).
 
 
 ### Specify a normal distribution
````
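A minimal sketch of the `initial_guess` option described in the changed line, assuming the `NamedTuple` config form from the docstrings; the objective `f` and the guess values are hypothetical:

```julia
using ConsensusBasedX

f(x) = sum(abs2, x)  # hypothetical objective

# Particles are initialised from a unit-variance normal centred on the guess
config = (; D = 2, initial_guess = [1.0, 2.0])
minimise(f, config)
```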
src/interface/maximise.jl (1 addition, 1 deletion)

````diff
@@ -4,7 +4,7 @@ maximise(f; keywords...)
 ```
 ```julia
-maximise(config::NamedTuple, f)
+maximise(f, config::NamedTuple)
 ```
 Maximise the function `f` using Consensus-Based Optimisation.
````
src/interface/minimise.jl (1 addition, 1 deletion)

````diff
@@ -9,7 +9,7 @@ minimise(f, config::NamedTuple)
 Minimise the function `f` using Consensus-Based Optimisation (see [Function minimisation](@ref)).
-You must specify the dimension `D` of the problem. Other paramters (e.g. the number of particles `N` or the number of ensembles `M` can also be specified; see [Summary of options](@ref).
+You must specify the dimension `D` of the problem. Other parameters (e.g. the number of particles `N` or the number of ensembles `M`) can also be specified; see [Summary of options](@ref).
 `minimize`, `optimise`, or `optimize` are aliases for `minimise`.
````
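The corrected sentence lists `N` and `M` as optional parameters alongside the required `D`; a hedged sketch of passing them as keywords (the objective and the particular values are hypothetical):

```julia
using ConsensusBasedX

f(x) = sum(abs2, x)  # hypothetical objective

# D is required; N (number of particles) and M (number of ensembles)
# are optional, per the docstring
minimise(f; D = 2, N = 40, M = 2)
```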
src/interface/sample.jl (1 addition, 1 deletion)

````diff
@@ -9,7 +9,7 @@ sample(f, config::NamedTuple)
 Sample the distribution `exp(-αf)` using Consensus-Based Sampling (see [Distribution sampling](@ref)).
-You must specify the dimension `D` of the problem. Other paramters (e.g. the number of particles `N` or the number of ensembles `M` can also be specified; see [Summary of options](@ref).
+You must specify the dimension `D` of the problem. Other paramters (e.g. the number of particles `N` or the number of ensembles `M`) can also be specified; see [Summary of options](@ref).
 # Examples
````
