
Commit 92f75c9

Merge pull request #450 from ArnoStrouwen/LT
[skip ci] LanguageTool
2 parents 0283386 + 56dc842 commit 92f75c9

21 files changed, +92 −84 lines changed

docs/pages.jl

Lines changed: 2 additions & 2 deletions
@@ -20,10 +20,10 @@ pages = ["index.md",
     "Flux.jl" => "optimization_packages/flux.md",
     "GCMAES.jl" => "optimization_packages/gcmaes.md",
     "MathOptInterface.jl" => "optimization_packages/mathoptinterface.md",
-    "MultistartOptimization.jl" => "optimization_packages/multistartoptimization.md",
     "Metaheuristics.jl" => "optimization_packages/metaheuristics.md",
-    "NOMAD.jl" => "optimization_packages/nomad.md",
+    "MultistartOptimization.jl" => "optimization_packages/multistartoptimization.md",
     "NLopt.jl" => "optimization_packages/nlopt.md",
+    "NOMAD.jl" => "optimization_packages/nomad.md",
     "Nonconvex.jl" => "optimization_packages/nonconvex.md",
     "Optim.jl" => "optimization_packages/optim.md",
     "Optimisers.jl" => "optimization_packages/optimisers.md",

docs/src/API/modelingtoolkit.md

Lines changed: 1 addition & 1 deletion
@@ -14,6 +14,6 @@ details.
 
 Secondly, one can generate `OptimizationProblem`s for use in
 Optimization.jl from purely a symbolic front-end. This is the form
-users will encounter when using ModelingToolkit.jl directly, and its
+users will encounter when using ModelingToolkit.jl directly, and it is
 also the form supplied by domain-specific languages. For more information,
 see the [OptimizationSystem documentation](https://docs.sciml.ai/ModelingToolkit/stable/systems/OptimizationSystem/).
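Since this hunk only shows the prose, here is a compact sketch of what the symbolic front-end it describes looks like in practice, following the linked `OptimizationSystem` tutorial (the Rosenbrock-style loss and the solver choice are illustrative assumptions, not part of this commit):

```julia
# Sketch: generating an OptimizationProblem from a symbolic OptimizationSystem.
using ModelingToolkit, Optimization, OptimizationOptimJL

@variables x y
@parameters a b

loss = (a - x)^2 + b * (y - x^2)^2           # Rosenbrock in symbolic form
@named sys = OptimizationSystem(loss, [x, y], [a, b])

u0 = [x => 1.0, y => 2.0]                    # initial values for the unknowns
p = [a => 1.0, b => 100.0]                   # parameter values

# `grad`/`hess` ask ModelingToolkit to symbolically generate the derivatives.
prob = OptimizationProblem(sys, u0, p, grad = true, hess = true)
sol = solve(prob, GradientDescent())
```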

docs/src/index.md

Lines changed: 3 additions & 3 deletions
@@ -1,11 +1,11 @@
 # Optimization.jl: A Unified Optimization Package
 
 Optimization.jl is a package with a scope that is beyond your normal global optimization
-package. Optimization.jl seeks to bring together all of the optimization packages
+package. Optimization.jl seeks to bring together all the optimization packages
 it can find, local and global, into one unified Julia interface. This means, you
-learn one package and you learn them all! Optimization.jl adds a few high-level
+learn one package, and you learn them all! Optimization.jl adds a few high-level
 features, such as integrating with automatic differentiation, to make its usage
-fairly simple for most cases, while allowing all of the options in a single
+fairly simple for most cases, while allowing all the options in a single
 unified interface.
 
 ## Installation
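The unified-interface claim in this hunk is concrete: the same problem definition can be handed to solvers from unrelated packages. A minimal sketch, with illustrative solver choices:

```julia
# One problem definition, two very different solver backends.
using Optimization, OptimizationOptimJL, OptimizationBBO

rosenbrock(u, p) = (p[1] - u[1])^2 + p[2] * (u[2] - u[1]^2)^2
u0 = zeros(2)
p = [1.0, 100.0]

# Local, derivative-free optimization via Optim.jl
prob = OptimizationProblem(rosenbrock, u0, p)
sol_local = solve(prob, NelderMead())

# Global optimization via BlackBoxOptim.jl (box constraints required)
prob_box = OptimizationProblem(rosenbrock, u0, p,
                               lb = [-1.0, -1.0], ub = [1.0, 1.0])
sol_global = solve(prob_box, BBO_adaptive_de_rand_1_bin_radiuslimited())
```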

docs/src/optimization_packages/blackboxoptim.md

Lines changed: 2 additions & 2 deletions
@@ -1,5 +1,5 @@
 # BlackBoxOptim.jl
-[`BlackBoxOptim`](https://github.com/robertfeldt/BlackBoxOptim.jl) is a is a Julia package implementing **(Meta-)heuristic/stochastic algorithms** that do not require for the optimized function to be differentiable.
+[`BlackBoxOptim`](https://github.com/robertfeldt/BlackBoxOptim.jl) is a Julia package implementing **(Meta-)heuristic/stochastic algorithms** that do not require the optimized function to be differentiable.
 
 ## Installation: OptimizationBBO.jl
 
@@ -46,7 +46,7 @@ The currently available algorithms are listed [here](https://github.com/robertfe
 
 ## Example
 
-The Rosenbrock function can optimized using the `BBO_adaptive_de_rand_1_bin_radiuslimited()` as follows:
+The Rosenbrock function can be optimized using the `BBO_adaptive_de_rand_1_bin_radiuslimited()` as follows:
 
 ```@example BBO
 using Optimization, OptimizationBBO
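The `@example BBO` block is truncated by the diff context. A minimal runnable version of the example it opens, assuming the standard Rosenbrock setup these docs use elsewhere, would be:

```julia
using Optimization, OptimizationBBO

rosenbrock(u, p) = (p[1] - u[1])^2 + p[2] * (u[2] - u[1]^2)^2
u0 = zeros(2)
p = [1.0, 100.0]

# BBO methods are box-constrained, so `lb` and `ub` must be supplied.
prob = OptimizationProblem(rosenbrock, u0, p, lb = [-1.0, -1.0], ub = [1.0, 1.0])
sol = solve(prob, BBO_adaptive_de_rand_1_bin_radiuslimited())
```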

docs/src/optimization_packages/cmaevolutionstrategy.md

Lines changed: 1 addition & 1 deletion
@@ -19,7 +19,7 @@ constraint equations. However, lower and upper constraints set by `lb` and `ub`
 
 ## Example
 
-The Rosenbrock function can optimized using the `CMAEvolutionStrategyOpt()` as follows:
+The Rosenbrock function can be optimized using the `CMAEvolutionStrategyOpt()` as follows:
 
 ```@example CMAEvolutionStrategy
 using Optimization, OptimizationCMAEvolutionStrategy
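For reference, the truncated `@example CMAEvolutionStrategy` block plausibly continues along these lines (a sketch, not the verbatim file contents):

```julia
using Optimization, OptimizationCMAEvolutionStrategy

rosenbrock(u, p) = (p[1] - u[1])^2 + p[2] * (u[2] - u[1]^2)^2
u0 = zeros(2)
p = [1.0, 100.0]

# Per the hunk header, `lb`/`ub` box constraints are honored when given.
prob = OptimizationProblem(rosenbrock, u0, p, lb = [-1.0, -1.0], ub = [1.0, 1.0])
sol = solve(prob, CMAEvolutionStrategyOpt())
```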

docs/src/optimization_packages/evolutionary.md

Lines changed: 2 additions & 2 deletions
@@ -25,11 +25,11 @@ A `Evolutionary` algorithm is called by one of the following:
 
 - [`Evolutionary.CMAES()`](https://wildart.github.io/Evolutionary.jl/stable/cmaes/): **Covariance Matrix Adaptation Evolution Strategy algorithm**
 
-Algorithm specific options are defined as `kwargs`. See the respective documentation for more detail.
+Algorithm-specific options are defined as `kwargs`. See the respective documentation for more detail.
 
 ## Example
 
-The Rosenbrock function can optimized using the `Evolutionary.CMAES()` as follows:
+The Rosenbrock function can be optimized using the `Evolutionary.CMAES()` as follows:
 
 ```@example Evolutionary
 using Optimization, OptimizationEvolutionary
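Likewise, a sketch of the `Evolutionary.CMAES()` example this hunk edits, with illustrative population sizes passed as the algorithm-specific `kwargs` the hunk mentions:

```julia
using Optimization, OptimizationEvolutionary

rosenbrock(u, p) = (p[1] - u[1])^2 + p[2] * (u[2] - u[1]^2)^2
u0 = zeros(2)
p = [1.0, 100.0]

prob = OptimizationProblem(rosenbrock, u0, p)
# μ parents and λ offspring per generation; values here are illustrative.
sol = solve(prob, Evolutionary.CMAES(μ = 40, λ = 100))
```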

docs/src/optimization_packages/gcmaes.md

Lines changed: 3 additions & 3 deletions
@@ -1,5 +1,5 @@
 # GCMAES.jl
-[`GCMAES`](https://github.com/AStupidBear/GCMAES.jl) is a Julia package implementing the **Gradient-based Covariance Matrix Adaptation Evolutionary Strategy** which can utilize the gradient information to speed up the optimization process.
+[`GCMAES`](https://github.com/AStupidBear/GCMAES.jl) is a Julia package implementing the **Gradient-based Covariance Matrix Adaptation Evolutionary Strategy**, which can utilize the gradient information to speed up the optimization process.
 
 ## Installation: OptimizationGCMAES.jl
 
@@ -19,7 +19,7 @@ constraint equations. However, lower and upper constraints set by `lb` and `ub`
 
 ## Example
 
-The Rosenbrock function can optimized using the `GCMAESOpt()` without utilizing the gradient information as follows:
+The Rosenbrock function can be optimized using the `GCMAESOpt()` without utilizing the gradient information as follows:
 
 ```@example GCMAES
 using Optimization, OptimizationGCMAES
@@ -31,7 +31,7 @@ prob = Optimization.OptimizationProblem(f, x0, p, lb = [-1.0,-1.0], ub = [1.0,1.
 sol = solve(prob, GCMAESOpt())
 ```
 
-We can also utilise the gradient information of the optimization problem to aid the optimization as follows:
+We can also utilize the gradient information of the optimization problem to aid the optimization as follows:
 
 ```@example GCMAES
 f = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff())
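The two GCMAES calls described above, gradient-free and gradient-based, can be sketched end to end as follows (the bounds match the diff context; the rest is the standard Rosenbrock setup):

```julia
using Optimization, OptimizationGCMAES

rosenbrock(u, p) = (p[1] - u[1])^2 + p[2] * (u[2] - u[1]^2)^2
x0 = zeros(2)
p = [1.0, 100.0]

# Gradient-free call, matching the first example above.
f = OptimizationFunction(rosenbrock)
prob = OptimizationProblem(f, x0, p, lb = [-1.0, -1.0], ub = [1.0, 1.0])
sol = solve(prob, GCMAESOpt())

# Gradient-based call: an AD backend supplies the gradient GCMAES can exploit.
f_ad = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff())
prob_ad = OptimizationProblem(f_ad, x0, p, lb = [-1.0, -1.0], ub = [1.0, 1.0])
sol = solve(prob_ad, GCMAESOpt())
```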

docs/src/optimization_packages/mathoptinterface.md

Lines changed: 6 additions & 6 deletions
@@ -1,7 +1,7 @@
 # MathOptInterface.jl
 
-[MathOptInterface](https://github.com/jump-dev/MathOptInterface.jl) is Julia
-abstration layer to interface with variety of mathematical optimization solvers.
+[MathOptInterface](https://github.com/jump-dev/MathOptInterface.jl) is a Julia
+abstraction layer to interface with a variety of mathematical optimization solvers.
 
 ## Installation: OptimizationMOI.jl
 
@@ -16,10 +16,10 @@ import Pkg; Pkg.add("OptimizationMOI")
 As of now, the `Optimization` interface to `MathOptInterface` implements only
 the `maxtime` common keyword argument.
 
-An optimizer which supports the `MathOptInterface` API can be called be called
+An optimizer which supports the `MathOptInterface` API can be called
 directly if no optimizer options have to be defined.
 
-For example using the [`Ipopt.jl`](https://github.com/jump-dev/Ipopt.jl)
+For example, using the [`Ipopt.jl`](https://github.com/jump-dev/Ipopt.jl)
 optimizer:
 
 
@@ -29,9 +29,9 @@ sol = solve(prob, Ipopt.Optimizer())
 ```
 
 The optimizer options are handled in one of two ways. They can either be set via
-`OptimizationMOI.MOI.OptimizerWithAttributes()` or as keyword argument to `solve`.
+`OptimizationMOI.MOI.OptimizerWithAttributes()` or as keyword arguments to `solve`.
 
-For example using the `Ipopt.jl` optimizer:
+For example, using the `Ipopt.jl` optimizer:
 
 ```julia
 using OptimizationMOI, Ipopt
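Both option-passing styles mentioned in this hunk can be sketched as follows; `max_cpu_time` is a standard Ipopt option, used here purely as an illustration:

```julia
using Optimization, OptimizationMOI, Ipopt

rosenbrock(u, p) = (p[1] - u[1])^2 + p[2] * (u[2] - u[1]^2)^2
u0 = zeros(2)
p = [1.0, 100.0]

# Ipopt needs derivatives, so attach an AD backend.
f = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff())
prob = OptimizationProblem(f, u0, p)

# Way 1: bundle options into the optimizer via OptimizerWithAttributes.
opt = OptimizationMOI.MOI.OptimizerWithAttributes(Ipopt.Optimizer,
                                                  "max_cpu_time" => 60.0)
sol = solve(prob, opt)

# Way 2: pass the same option as a keyword argument to `solve`.
sol = solve(prob, Ipopt.Optimizer(); max_cpu_time = 60.0)
```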

docs/src/optimization_packages/metaheuristics.md

Lines changed: 7 additions & 7 deletions
@@ -1,5 +1,5 @@
 # Metaheuristics.jl
-[`Metaheuristics`](https://github.com/jmejia8/Metaheuristics.jl) is a is a Julia package implementing **metaheuristic algorithms** for global optiimization that do not require for the optimized function to be differentiable.
+[`Metaheuristics`](https://github.com/jmejia8/Metaheuristics.jl) is a Julia package implementing **metaheuristic algorithms** for global optimization that do not require the optimized function to be differentiable.
 
 ## Installation: OptimizationMetaheuristics.jl
 
@@ -15,7 +15,7 @@ import Pkg; Pkg.add("OptimizationMetaheuristics")
 A `Metaheuristics` Single-Objective algorithm is called using one of the following:
 
 * Evolutionary Centers Algorithm: `ECA()`
-* Differential Evolution: `DE()` with 5 different stratgies
+* Differential Evolution: `DE()` with 5 different strategies
   - `DE(strategy=:rand1)` - default strategy
   - `DE(strategy=:rand2)`
   - `DE(strategy=:best1)`
@@ -27,13 +27,13 @@ A `Metaheuristics` Single-Objective algorithm is called using one of the followi
 * Simulated Annealing: `SA()`
 * Whale Optimization Algorithm: `WOA()`
 
-`Metaheuristics` also performs [`Multiobjective optimization`](https://jmejia8.github.io/Metaheuristics.jl/stable/examples/#Multiobjective-Optimization) but this is not yet supported by `Optimization`.
+`Metaheuristics` also performs [`Multiobjective optimization`](https://jmejia8.github.io/Metaheuristics.jl/stable/examples/#Multiobjective-Optimization), but this is not yet supported by `Optimization`.
 
-Each optimizer sets default settings based on the optimization problem but specific parameters can be set as shown in the original [`Documentation`](https://jmejia8.github.io/Metaheuristics.jl/stable/algorithms/)
+Each optimizer sets default settings based on the optimization problem, but specific parameters can be set as shown in the original [`Documentation`](https://jmejia8.github.io/Metaheuristics.jl/stable/algorithms/)
 
-Additionally, `Metaheuristics` common settings which would be defined by [`Metaheuristics.Options`](https://jmejia8.github.io/Metaheuristics.jl/stable/api/#Metaheuristics.Options) can be simply passed as special keywoard arguments to `solve` without the need to use the `Metaheuristics.Options` struct.
+Additionally, `Metaheuristics` common settings which would be defined by [`Metaheuristics.Options`](https://jmejia8.github.io/Metaheuristics.jl/stable/api/#Metaheuristics.Options) can be simply passed as special keyword arguments to `solve` without the need to use the `Metaheuristics.Options` struct.
 
-Lastly, information about the optimization problem such as the true optimum is set via [`Metaheuristics.Information`](https://jmejia8.github.io/Metaheuristics.jl/stable/api/#Metaheuristics.Information) and passed as part of the optimizer struct to `solve` e.g. `solve(prob, ECA(information=Metaheuristics.Inoformation(f_optimum = 0.0)))`
+Lastly, information about the optimization problem, such as the true optimum, is set via [`Metaheuristics.Information`](https://jmejia8.github.io/Metaheuristics.jl/stable/api/#Metaheuristics.Information) and passed as part of the optimizer struct to `solve`, e.g., `solve(prob, ECA(information=Metaheuristics.Information(f_optimum = 0.0)))`
 
 
 
@@ -46,7 +46,7 @@ constraint equations. However, lower and upper constraints set by `lb` and `ub`
 
 ## Examples
 
-The Rosenbrock function can optimized using the Evolutionary Centers Algorithm `ECA()` as follows:
+The Rosenbrock function can be optimized using the Evolutionary Centers Algorithm `ECA()` as follows:
 
 ```@example Metaheuristics
 using Optimization, OptimizationMetaheuristics
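A sketch of the `ECA()` example this hunk edits, including the special keyword arguments to `solve` that replace `Metaheuristics.Options` (the settings shown are illustrative):

```julia
using Optimization, OptimizationMetaheuristics

rosenbrock(u, p) = (p[1] - u[1])^2 + p[2] * (u[2] - u[1]^2)^2
u0 = zeros(2)
p = [1.0, 100.0]

# Metaheuristics requires finite box constraints.
prob = OptimizationProblem(rosenbrock, u0, p, lb = [-1.0, -1.0], ub = [1.0, 1.0])

# Common settings go straight to `solve`, no Options struct needed.
sol = solve(prob, ECA(), maxiters = 100_000, maxtime = 10.0)
```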

docs/src/optimization_packages/multistartoptimization.md

Lines changed: 4 additions & 4 deletions
@@ -1,5 +1,5 @@
 # MultiStartOptimization.jl
-[`MultistartOptimization`](https://github.com/tpapp/MultistartOptimization.jl) is a is a Julia package implementing a global optimization multistart method which performs local optimization after choosing multiple starting points.
+[`MultistartOptimization`](https://github.com/tpapp/MultistartOptimization.jl) is a Julia package implementing a global optimization multistart method which performs local optimization after choosing multiple starting points.
 
 `MultistartOptimization` requires both a global and local method to be defined. The global multistart method chooses a set of initial starting points from where the local method starts.
 
@@ -14,7 +14,7 @@ import Pkg; Pkg.add("OptimizationMultistartOptimization")
 ```
 !!! note
 
-    You also need to load the relevant subpackage for the local method of you choice, for example if you plan to use one of the NLopt.jl's optimizers, you'd install and load OptimizationNLopt as described in the [NLopt.jl](@ref)'s section.
+    You also need to load the relevant subpackage for the local method of your choice. For example, if you plan to use one of NLopt.jl's optimizers, you'd install and load OptimizationNLopt as described in the [NLopt.jl](@ref) section.
 
 ## Global Optimizer
 ### Without Constraint Equations
@@ -24,7 +24,7 @@ constraint equations. However, lower and upper constraints set by `lb` and `ub`
 
 ## Examples
 
-The Rosenbrock function can optimized using `MultistartOptimization.TikTak()` with 100 initial points and the local method `NLopt.LD_LBFGS()` as follows:
+The Rosenbrock function can be optimized using `MultistartOptimization.TikTak()` with 100 initial points and the local method `NLopt.LD_LBFGS()` as follows:
 
 ```@example MultiStart
 using Optimization, OptimizationMultistartOptimization, OptimizationNLopt
@@ -36,7 +36,7 @@ prob = Optimization.OptimizationProblem(f, x0, p, lb = [-1.0,-1.0], ub = [1.0,1.
 sol = solve(prob, MultistartOptimization.TikTak(100), NLopt.LD_LBFGS())
 ```
 
-You can use any `Optimization` optimizers you like. The global method of the `MultistartOptimization` is a positional argument and followed by the local method. This for example means we can perform a multistartoptimization with LBFGS as the optimizer using either the `NLopt.jl` or `Optim.jl` implementation as follows. Moreover, this interface allows you access and adjust all the optimizer settings as you normally would:
+You can use any `Optimization` optimizers you like. The global method of `MultistartOptimization` is a positional argument, followed by the local method. For example, we can perform a multistart optimization with LBFGS as the optimizer, using either the `NLopt.jl` or `Optim.jl` implementation, as follows. Moreover, this interface allows you to access and adjust all the optimizer settings as you normally would:
 
 ```@example MultiStart
 using OptimizationOptimJL
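A sketch of the Optim.jl variant the paragraph above describes, with `TikTak` as the global method and `LBFGS()` as the local method (the bounds match the diff context; the AD backend is needed for LBFGS gradients):

```julia
using Optimization, OptimizationMultistartOptimization, OptimizationOptimJL

rosenbrock(u, p) = (p[1] - u[1])^2 + p[2] * (u[2] - u[1]^2)^2
x0 = zeros(2)
p = [1.0, 100.0]

# TikTak needs finite bounds to draw its initial points from.
f = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff())
prob = OptimizationProblem(f, x0, p, lb = [-1.0, -1.0], ub = [1.0, 1.0])

# Global method first, local method second; local optimizer settings
# can be adjusted as they normally would.
sol = solve(prob, MultistartOptimization.TikTak(100), LBFGS())
```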
