docs/src/optimization_packages/blackboxoptim.md (+2 −2)
@@ -1,5 +1,5 @@
 # BlackBoxOptim.jl
-[`BlackBoxOptim`](https://github.com/robertfeldt/BlackBoxOptim.jl) is a is a Julia package implementing **(Meta-)heuristic/stochastic algorithms** that do not require for the optimized function to be differentiable.
+[`BlackBoxOptim`](https://github.com/robertfeldt/BlackBoxOptim.jl) is a Julia package implementing **(Meta-)heuristic/stochastic algorithms** that do not require the optimized function to be differentiable.

 ## Installation: OptimizationBBO.jl
@@ -46,7 +46,7 @@ The currently available algorithms are listed [here](https://github.com/robertfe
 ## Example

-The Rosenbrock function can optimized using the `BBO_adaptive_de_rand_1_bin_radiuslimited()` as follows:
+The Rosenbrock function can be optimized using `BBO_adaptive_de_rand_1_bin_radiuslimited()` as follows:
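The example body itself falls outside this diff; a minimal sketch of what it presumably looks like, following the standard Rosenbrock setup used throughout these docs (the bounds and parameter values here are assumptions, not part of the diff):

```julia
using Optimization, OptimizationBBO

# Standard two-parameter Rosenbrock test function.
rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
x0 = zeros(2)
p = [1.0, 100.0]

# BlackBoxOptim methods search within box constraints, so lb/ub are required.
prob = Optimization.OptimizationProblem(rosenbrock, x0, p, lb = [-1.0, -1.0], ub = [1.0, 1.0])
sol = solve(prob, BBO_adaptive_de_rand_1_bin_radiuslimited())
```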
docs/src/optimization_packages/gcmaes.md (+3 −3)
@@ -1,5 +1,5 @@
 # GCMAES.jl
-[`GCMAES`](https://github.com/AStupidBear/GCMAES.jl) is a Julia package implementing the **Gradient-based Covariance Matrix Adaptation Evolutionary Strategy** which can utilize the gradient information to speed up the optimization process.
+[`GCMAES`](https://github.com/AStupidBear/GCMAES.jl) is a Julia package implementing the **Gradient-based Covariance Matrix Adaptation Evolutionary Strategy**, which can utilize gradient information to speed up the optimization process.

 ## Installation: OptimizationGCMAES.jl
@@ -19,7 +19,7 @@ constraint equations. However, lower and upper constraints set by `lb` and `ub`
 ## Example

-The Rosenbrock function can optimized using the `GCMAESOpt()` without utilizing the gradient information as follows:
+The Rosenbrock function can be optimized using `GCMAESOpt()` without utilizing the gradient information as follows:
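Again, the example body is outside this diff; a hedged sketch under the same standard-Rosenbrock assumptions (the hunk context above confirms that `lb` and `ub` bounds are expected):

```julia
using Optimization, OptimizationGCMAES

rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
x0 = zeros(2)
p = [1.0, 100.0]

# No AD backend is given, so no gradient information is used by GCMAESOpt.
f = OptimizationFunction(rosenbrock)
prob = Optimization.OptimizationProblem(f, x0, p, lb = [-1.0, -1.0], ub = [1.0, 1.0])
sol = solve(prob, GCMAESOpt())
```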
docs/src/optimization_packages/metaheuristics.md (+7 −7)
@@ -1,5 +1,5 @@
 # Metaheuristics.jl
-[`Metaheuristics`](https://github.com/jmejia8/Metaheuristics.jl) is a is a Julia package implementing **metaheuristic algorithms** for global optiimization that do not require for the optimized function to be differentiable.
+[`Metaheuristics`](https://github.com/jmejia8/Metaheuristics.jl) is a Julia package implementing **metaheuristic algorithms** for global optimization that do not require the optimized function to be differentiable.

 A `Metaheuristics` Single-Objective algorithm is called using one of the following:

 * Evolutionary Centers Algorithm: `ECA()`
-* Differential Evolution: `DE()` with 5 different stratgies
+* Differential Evolution: `DE()` with 5 different strategies
   - `DE(strategy=:rand1)` - default strategy
   - `DE(strategy=:rand2)`
   - `DE(strategy=:best1)`
@@ -27,13 +27,13 @@ A `Metaheuristics` Single-Objective algorithm is called using one of the followi
 * Simulated Annealing: `SA()`
 * Whale Optimization Algorithm: `WOA()`

-`Metaheuristics` also performs [`Multiobjective optimization`](https://jmejia8.github.io/Metaheuristics.jl/stable/examples/#Multiobjective-Optimization) but this is not yet supported by `Optimization`.
+`Metaheuristics` also performs [`Multiobjective optimization`](https://jmejia8.github.io/Metaheuristics.jl/stable/examples/#Multiobjective-Optimization), but this is not yet supported by `Optimization`.

-Each optimizer sets default settings based on the optimization problem but specific parameters can be set as shown in the original [`Documentation`](https://jmejia8.github.io/Metaheuristics.jl/stable/algorithms/)
+Each optimizer sets default settings based on the optimization problem, but specific parameters can be set as shown in the original [`Documentation`](https://jmejia8.github.io/Metaheuristics.jl/stable/algorithms/).

-Additionally, `Metaheuristics` common settings which would be defined by [`Metaheuristics.Options`](https://jmejia8.github.io/Metaheuristics.jl/stable/api/#Metaheuristics.Options) can be simply passed as special keywoard arguments to `solve` without the need to use the `Metaheuristics.Options` struct.
+Additionally, `Metaheuristics` common settings, which would otherwise be defined via [`Metaheuristics.Options`](https://jmejia8.github.io/Metaheuristics.jl/stable/api/#Metaheuristics.Options), can simply be passed as keyword arguments to `solve` without the need to use the `Metaheuristics.Options` struct, as sketched below.

-Lastly, information about the optimization problem such as the true optimum is set via [`Metaheuristics.Information`](https://jmejia8.github.io/Metaheuristics.jl/stable/api/#Metaheuristics.Information) and passed as part of the optimizer struct to `solve` e.g. `solve(prob, ECA(information=Metaheuristics.Inoformation(f_optimum = 0.0)))`
+Lastly, information about the optimization problem, such as the true optimum, is set via [`Metaheuristics.Information`](https://jmejia8.github.io/Metaheuristics.jl/stable/api/#Metaheuristics.Information) and passed as part of the optimizer struct to `solve`, e.g. `solve(prob, ECA(information = Metaheuristics.Information(f_optimum = 0.0)))`.
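A minimal sketch of that keyword pass-through (assuming an `OptimizationProblem` named `prob` built as in the Examples section below; the particular values are arbitrary illustrations):

```julia
# Settings that would normally live in Metaheuristics.Options go straight to solve
# as common keyword arguments; problem metadata travels inside the optimizer
# struct via Metaheuristics.Information.
sol = solve(prob, ECA(information = Metaheuristics.Information(f_optimum = 0.0)),
    maxiters = 100)
```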
@@ -46,7 +46,7 @@ constraint equations. However, lower and upper constraints set by `lb` and `ub`
 ## Examples

-The Rosenbrock function can optimized using the Evolutionary Centers Algorithm `ECA()` as follows:
+The Rosenbrock function can be optimized using the Evolutionary Centers Algorithm `ECA()` as follows:
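The example body is not part of this diff; a hedged sketch following the same standard-Rosenbrock pattern (bounds assumed, since Metaheuristics methods search within `lb`/`ub`):

```julia
using Optimization, OptimizationMetaheuristics

rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
x0 = zeros(2)
p = [1.0, 100.0]

prob = Optimization.OptimizationProblem(rosenbrock, x0, p, lb = [-1.0, -1.0], ub = [1.0, 1.0])
sol = solve(prob, ECA())
```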
docs/src/optimization_packages/multistartoptimization.md (+4 −4)
@@ -1,5 +1,5 @@
 # MultiStartOptimization.jl
-[`MultistartOptimization`](https://github.com/tpapp/MultistartOptimization.jl) is a is a Julia package implementing a global optimization multistart method which performs local optimization after choosing multiple starting points.
+[`MultistartOptimization`](https://github.com/tpapp/MultistartOptimization.jl) is a Julia package implementing a global optimization multistart method which performs local optimization after choosing multiple starting points.

 `MultistartOptimization` requires both a global and a local method to be defined. The global multistart method chooses a set of initial starting points from which the local method then starts.
-You also need to load the relevant subpackage for the local method of you choice, for example if you plan to use one of the NLopt.jl's optimizers, you'd install and load OptimizationNLopt as described in the [NLopt.jl](@ref)'s section.
+You also need to load the relevant subpackage for the local method of your choice. For example, if you plan to use one of NLopt.jl's optimizers, you'd install and load OptimizationNLopt as described in the [NLopt.jl](@ref) section.

 ## Global Optimizer

 ### Without Constraint Equations
@@ -24,7 +24,7 @@ constraint equations. However, lower and upper constraints set by `lb` and `ub`
 ## Examples

-The Rosenbrock function can optimized using `MultistartOptimization.TikTak()` with 100 initial points and the local method `NLopt.LD_LBFGS()` as follows:
+The Rosenbrock function can be optimized using `MultistartOptimization.TikTak()` with 100 initial points and the local method `NLopt.LD_LBFGS()` as follows:

 ```@example MultiStart
 using Optimization, OptimizationMultistartOptimization, OptimizationNLopt
 rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
 x0 = zeros(2)
 p = [1.0, 100.0]
 # LD_LBFGS is gradient-based, so the local method needs an AD backend.
 f = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff())
 prob = Optimization.OptimizationProblem(f, x0, p, lb = [-1.0, -1.0], ub = [1.0, 1.0])
 sol = solve(prob, MultistartOptimization.TikTak(100), NLopt.LD_LBFGS())
 ```

-You can use any `Optimization` optimizers you like. The global method of the `MultistartOptimization` is a positional argument and followed by the local method. This for example means we can perform a multistartoptimization with LBFGS as the optimizer using either the `NLopt.jl` or `Optim.jl` implementation as follows. Moreover, this interface allows you access and adjust all the optimizer settings as you normally would:
+You can use any `Optimization` optimizers you like. The global method of `MultistartOptimization` is a positional argument, followed by the local method. For example, this means we can perform a multistart optimization with LBFGS as the local optimizer using either the `NLopt.jl` or the `Optim.jl` implementation, as sketched below. Moreover, this interface allows you to access and adjust all the optimizer settings as you normally would:
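A hedged sketch of swapping local methods, reusing `prob` from the example above (the `OptimizationOptimJL` subpackage and the `maxiters` setting are assumptions, shown only to illustrate adjusting settings as usual):

```julia
using OptimizationOptimJL # subpackage wrapping Optim.jl's optimizers

# LBFGS via NLopt.jl as the local method:
sol_nlopt = solve(prob, MultistartOptimization.TikTak(100), NLopt.LD_LBFGS())

# LBFGS via Optim.jl as the local method, with a common solver setting adjusted:
sol_optim = solve(prob, MultistartOptimization.TikTak(100), Optim.LBFGS(), maxiters = 1000)
```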