Batch API #521
Conversation
| @test_throws(MethodError, cons_lin!(model, [0.0], [1.0])) | ||
| @test_throws(MethodError, cons_nln!(model, [0.0], [1.0])) | ||
| @test_throws(MethodError, jac_lin_coord!(model, [0.0], [1.0])) | ||
| @test_throws(MethodError, jac_lin_coord!(model, [1.0])) |
This is not related, just hides the deprecation warning (#501)
cc @frapac

Docs are broken due to #520

Hi @klamike! Can you expand a bit more on the context for this and the potential use of it?
@tmigot The idea is to define an API for talking about batches of models in a standardized way across JSO. Each solver (up for discussion -- maybe we should do that here instead?) can then define its own batch model types, which would mark the relevant assumptions (e.g. same jac/hess structure, same nvar/ncon, etc.) so that the shared structure can be exploited, importantly not just in the evaluation of the model but also in the solvers themselves. For example, if we assume the same jac/hess structure, we can use the CUDSS uniform batch API for solving KKT systems instead of looping over single solves, which should bring a substantial speedup.

To be honest, this kind of change is too big for me to think through all the consequences of the different design decisions at once, so we decided it might be best to just implement something dumb first to get some visibility into the tradeoffs.
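To make the shared-structure point concrete, here is a minimal sketch (not code from this PR; `batch_jac_vals!` is a hypothetical helper name): if every model in the batch shares one Jacobian sparsity pattern, the nonzero values can be collected as columns of a single matrix, which is the layout a uniform batched factorization like CUDSS expects.

```julia
# Hedged sketch, assuming NLPModels.jl and a batch whose models all share one
# Jacobian sparsity pattern. `batch_jac_vals!` is hypothetical, not part of this PR.
using NLPModels

function batch_jac_vals!(models, xs, vals::AbstractMatrix)
    for i in eachindex(models)
        # Same (rows, cols) for every model, so only the values differ per column.
        jac_coord!(models[i], xs[i], view(vals, :, i))
    end
    return vals  # nnz × batch_size, ready for a uniform batched factorization
end
```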
| isempty(updates) && error("Cannot create InplaceBatchNLPModel from empty collection.") | ||
| InplaceBatchNLPModel{M}(base_model, updates, Counters(), length(updates)) | ||
| end | ||
| # TODO: counters? |
Note that the counters are not properly wired up yet, both here and in the foreach version
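One way they could be wired up (just an assumption on my side, not what the PR currently does) is to bump the batch model's own `Counters` once per batched call via the standard `increment!` mechanism:

```julia
# Hedged sketch: `increment!` and `Counters` are the standard NLPModels.jl counter
# mechanism; the `models` field name below is a hypothetical placeholder.
using NLPModels

function batch_obj(bnlp, xs)
    fx = [obj(m, x) for (m, x) in zip(bnlp.models, xs)]
    increment!(bnlp, :neval_obj)  # count one batched objective evaluation
    return fx
end
```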
| ] | ||
| @test hprods ≈ manual_hprods | ||
| if isa(bnlp, ForEachBatchNLPModel) # NOTE: excluding InplaceBatchNLPModel |
Had to exclude the inplace version here since the closures in the operators reference the model state, which we are mutating without their knowledge
It should be doable to put the update function inside the operator closure.
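For anyone reading along, a tiny standalone illustration of the capture problem (nothing below is from the PR; `ToyModel` and `toy_obj` are made up):

```julia
# Not from the PR: a minimal demonstration of closures capturing mutable model state.
mutable struct ToyModel
    c::Float64
end
toy_obj(m::ToyModel, x) = m.c * sum(abs2, x)

base = ToyModel(1.0)
f = x -> toy_obj(base, x)   # the closure captures `base` itself, not a snapshot
f([1.0, 2.0])               # 5.0
base.c = 10.0               # in-place "update" for the next batch element
f([1.0, 2.0])               # 50.0 -- the operator built earlier now sees the mutation
```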
| length(xs) == 0 && error("Cannot call _batch_map! without providing arguments.") | ||
| @lencheck_tup n xs | ||
| outputs = xs[end] | ||
| inputs = length(xs) == 1 ? () : Base.ntuple(i -> xs[i], length(xs) - 1) |
I am hoping that making Julia specialize on the `N` in `Vararg{Any,N}` will make this compile down to something reasonable.
@amontoison
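A small sketch of the specialization idea (hypothetical `_split_args` helper, not the PR's `_batch_map!`): annotating the splatted arguments as `Vararg{Any,N}` makes `N` a static parameter, so the tuple split can be resolved at compile time.

```julia
# Hedged sketch: with `Vararg{Any,N}`, `N` is known to the compiler, so
# `ntuple(f, Val(N - 1))` builds a statically sized tuple with no dynamic length.
function _split_args(xs::Vararg{Any,N}) where {N}
    N == 0 && error("need at least one argument")
    outputs = xs[end]
    inputs = ntuple(i -> xs[i], Val(N - 1))
    return inputs, outputs
end

_split_args([1.0], [2.0], [0.0])  # → (([1.0], [2.0]), [0.0])
```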
Here is what I have so far for the batch API and some "dumb" implementations of it, `ForEachBatchNLPModel` and `InplaceBatchNLPModel`, as discussed. I'm opening this mostly to start a discussion about how best to do batches of models, not necessarily suggesting we merge it as is (after all, it is "dumb" 😉). `ForEachBatchNLPModel` just loops over the children's functions, and `InplaceBatchNLPModel` makes users pass `updates`, a list of functions that update a `base_model` to a batch element. The idea is that if you are just varying e.g. the `lcon`, you can do something like the sketch below.
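(A hedged sketch: the exact constructor and update-function signatures may differ from the PR, and the base model is built with ADNLPModels purely for illustration.)

```julia
# Hedged sketch: only `lcon` varies across the batch; the constructor call and the
# update-function signature are assumptions about this PR's API.
using ADNLPModels

base_model = ADNLPModel(x -> sum(abs2, x), [1.0, 2.0],
                        x -> [x[1] + x[2]], [0.0], [0.0])

lcon_batch = [[-1.0], [0.0], [1.0]]
updates = [m -> (m.meta.lcon .= lc; m) for lc in lcon_batch]  # one update per batch element

bnlp = InplaceBatchNLPModel(base_model, updates)
```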