
Commit acd692b

add a new 1st tutorial
1 parent fb262ab commit acd692b


5 files changed: +384 −8 lines changed

docs/make.jl

Lines changed: 2 additions & 0 deletions
@@ -11,6 +11,7 @@ makedocs(
     pages=[
         "Home" => "index.md",
         "Tutorials" => Any[
+            "tutorials/symbolic_functions.md",
             "tutorials/ode_modeling.md",
             "tutorials/higher_order.md",
             "tutorials/nonlinear.md",
@@ -28,6 +29,7 @@ makedocs(
             "systems/PDESystem.md",
             "systems/DependencyGraphs.md"
         ],
+        "Comparison Against SymPy" => "comparison.md",
         "highlevel.md",
         "IR.md"
     ]

docs/src/IR.md

Lines changed: 1 addition & 4 deletions
@@ -47,12 +47,9 @@ out all of the differentials, the `expand_derivatives` function eliminates all
 of the differentials down to basic one-variable expressions.

 ```@docs
+ModelingToolkit.derivative
 Differential
 expand_derivatives
-ModelingToolkit.derivative
-ModelingToolkit.gradient
-ModelingToolkit.jacobian
-ModelingToolkit.hessian
 ```

 Note that the generation of sparse matrices simply follows from the Julia semantics
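For readers skimming this hunk, here is a minimal sketch (not part of the diff itself) of the `Differential`/`expand_derivatives` workflow that the reordered `@docs` block documents; the variables and expression are illustrative:

```julia
using ModelingToolkit

@variables x y
D = Differential(x)      # the symbolic differential operator d/dx
ex = D(x^2 + y)          # a lazy, unexpanded differential expression
expand_derivatives(ex)   # expands the differential down to the one-variable expression 2x
```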

docs/src/comparison.md

Lines changed: 59 additions & 0 deletions
@@ -0,0 +1,59 @@

# Comparison of Julia's ModelingToolkit vs SymPy for Symbolic Computation

ModelingToolkit.jl is a symbolic modeling language for Julia, built in
Julia. Its goal is very different from SymPy's: it was made to support
symbolic-numerics, the combination of symbolic computing with numerical
methods, enabling high-performance computations that would not be
possible without modifying the model. Because of this, ModelingToolkit.jl
excels in many areas due to purposeful design decisions:

- Performance: ModelingToolkit.jl is built in Julia, whereas SymPy was
  built in Python, so the performance bar for ModelingToolkit.jl is
  much higher. ModelingToolkit.jl started because SymPy was far too
  slow and SymEngine was far too inflexible for the projects its
  developers were working on. Performance is key to ModelingToolkit.jl.
  If you find any performance issues, please file an issue.
- build_function: `lambdify` is "fine" for some people, but if you're building
  a super fast MPI-enabled Julia/C/Fortran simulation code, having a
  function that hits the Python interpreter is less than optimal. By
  default, `build_function` builds fast JIT-compiled functions because
  it lives in Julia. However, it also supports static arrays,
  non-allocating functions via mutation, fast functions on sparse
  matrices and arrays of arrays, etc.: all core details of doing
  high-performance computing (see the `build_function` sketch below).
- Parallelism: ModelingToolkit.jl has pervasive parallelism. The
  symbolic simplification via [SymbolicUtils.jl](https://github.com/JuliaSymbolics/SymbolicUtils.jl)
  has built-in parallelism, ModelingToolkit.jl builds functions that
  parallelize across threads and multiprocess across clusters,
  and it has dynamic scheduling through tools like [Dagger.jl](https://github.com/JuliaParallel/Dagger.jl).
  ModelingToolkit.jl is compatible with GPU libraries like CUDA.jl.
- Scientific Machine Learning (SciML): ModelingToolkit.jl is made to synergize
  with the high-performance Julia SciML ecosystem in many ways. At a
  base level, all expressions and built functions are compatible with
  automatic differentiation tools like ForwardDiff.jl and Zygote.jl, meaning
  that it can be used in and with neural networks. Tools like
  [DataDrivenDiffEq.jl](https://datadriven.sciml.ai/dev/) can reconstruct
  symbolic expressions from neural networks and data, while
  [NeuralNetDiffEq.jl](https://github.com/SciML/NeuralNetDiffEq.jl)
  can automatically solve partial differential equations from symbolic
  descriptions using physics-informed neural networks.
- Primitives for high-performance numerics: Features like `ODESystem`
  can be used to easily generate automatically parallelized ODE solver
  code with sparse Jacobians and all of the pieces required to get
  the most efficient solves (see the `ODESystem` sketch below). Support
  for differential-algebraic equations, chemical reaction networks, and
  generation of code for nonlinear optimization tools makes
  ModelingToolkit.jl a tool for, well, building, generating, and
  analyzing models.
- Deep integration with the Julia ecosystem: ModelingToolkit.jl's integration
  with neural networks is not the only thing that's deep. ModelingToolkit.jl
  is built with the same philosophy as other SciML packages, eschewing
  "monorepos" for a distributed development approach that ties together
  the work of many developers. The differentiation parts utilize tools
  from automatic differentiation libraries, all linear algebra functionality
  comes from tracing Julia Base itself, symbolic rewriting (simplification
  and substitution) comes from
  [SymbolicUtils.jl](https://github.com/JuliaSymbolics/SymbolicUtils.jl),
  parallelism comes from Julia Base libraries and Dagger.jl, and so on.
  The list keeps going. All told, by design, ModelingToolkit.jl's development
  moves quickly because it effectively builds on the work of hundreds of
  Julia developers, allowing it to grow fast.
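Two short sketches to accompany the bullets above; both are illustrative only, and the expressions, solver choice, and keyword arguments are assumptions based on the ModelingToolkit API as documented around the time of this commit, not part of the committed file. First, `build_function` generating both an allocating and a non-allocating (mutating) function from an array of symbolic expressions:

```julia
using ModelingToolkit

@variables x y
to_compute = [x^2 + y, y^2 + x]

# Returns an out-of-place function and an in-place (mutating) function;
# expression = Val{false} asks for compiled functions rather than Expr objects.
f_oop, f_iip = build_function(to_compute, [x, y], expression = Val{false})

f_oop([1.0, 2.0])        # allocates and returns [3.0, 5.0]
out = zeros(2)
f_iip(out, [1.0, 2.0])   # writes the result into `out` without allocating
```

Second, `ODESystem` turning symbolic equations into an `ODEProblem` with a symbolically generated, sparse Jacobian; the Lorenz system is used purely as a stand-in model, and constructor details may differ in later ModelingToolkit versions:

```julia
using ModelingToolkit, OrdinaryDiffEq

@parameters t σ ρ β
@variables x(t) y(t) z(t)
D = Differential(t)

eqs = [D(x) ~ σ * (y - x),
       D(y) ~ x * (ρ - z) - y,
       D(z) ~ x * y - β * z]
sys = ODESystem(eqs)

u0    = [x => 1.0, y => 0.0, z => 0.0]
p     = [σ => 10.0, ρ => 28.0, β => 8 / 3]
tspan = (0.0, 100.0)

# jac = true generates the symbolic Jacobian; sparse = true emits it as a sparse matrix.
prob = ODEProblem(sys, u0, tspan, p, jac = true, sparse = true)
sol  = solve(prob, Rodas5())
```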

docs/src/highlevel.md

Lines changed: 8 additions & 0 deletions
@@ -15,6 +15,14 @@ Base.:~(::Expression, ::Expression)
 modelingtoolkitize
 ```

+## Differentiation Functions
+
+```@docs
+ModelingToolkit.gradient
+ModelingToolkit.jacobian
+ModelingToolkit.hessian
+```
+
 ## Additional High-Level Explanations and Tips

 ### The Auto-Detecting System Constructors
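For context on the three functions this new `@docs` block moves into `highlevel.md`, here is a minimal sketch of their symbolic use; the expressions are illustrative, not from the diff:

```julia
using ModelingToolkit

@variables x y
ModelingToolkit.gradient(x^2 * y, [x, y])                # symbolic gradient of a scalar expression
ModelingToolkit.jacobian([x^2 * y, y + sin(x)], [x, y])  # 2×2 symbolic Jacobian of a vector expression
ModelingToolkit.hessian(x^2 * y, [x, y])                 # symbolic Hessian of a scalar expression
```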
