Update doc discussion of parameter typing and check tests #1011

Open
isaacsas opened this issue Aug 2, 2024 · 9 comments
Comments

@isaacsas (Member) commented Aug 2, 2024

Given that MTK has changed behavior and now promotes all tunables to a common type, the doc discussion of parameter typing needs to be checked and updated accordingly (or perhaps just removed until this has clearly stabilized). Likewise, the tests need to be checked to confirm they are still testing what they were intended to test (even if they pass).

@isaacsas added the bug label on Aug 2, 2024
@TorkelE (Member) commented Aug 2, 2024

Sounds good. If I understand correctly, the big difference is that parameters with non-default types cannot be tunable (and we should add a note about that wherever relevant)?

@ChrisRackauckas (Member) commented

Tunables are always kept as a vector, so if you, say, remake some to a Dual type, then it will convert them all to Dual. This is of course a requirement for any AD or optimization library, and is also a requirement for any adjoint overload.
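As a minimal sketch of that promotion (hedged: the system and parameter names are illustrative, and the exact promotion path through `remake` is my reading of the behavior described above, not a documented guarantee):

```julia
using ModelingToolkit, OrdinaryDiffEq, ForwardDiff

@parameters a b            # default symtype Real -> both tunable
@variables t x(t)
D = Differential(t)
@named sys = ODESystem([D(x) ~ -a * x + b], t)
sys = complete(sys)

prob = ODEProblem(sys, [x => 1.0], (0.0, 1.0), [a => 2.0, b => 0.5])

# remake only `a` to a Dual; because the tunables live in one
# homogeneous vector, `b` should be promoted to the same Dual type.
prob2 = remake(prob, p = [a => ForwardDiff.Dual(2.0, 1.0)])
```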

@isaacsas (Member, Author) commented Aug 2, 2024

I think it is that integer-typed parameters are not tunable and can be declared as we currently do. Any mixture of float-typed parameters is considered tunable and promoted to a common type, so mixing Float32 and Float64 symbolics will ultimately result in a single type (I guess Float64).
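A small sketch of that split (hedged: `tunable_parameters` is MTK's query for the tunable set, and the expected result is my reading of the new behavior):

```julia
using ModelingToolkit

@parameters n::Int k1::Float32 k2::Float64 k3   # k3 defaults to symtype Real
@variables t x(t)
D = Differential(t)
@named sys = ODESystem([D(x) ~ -k1 * k2 * k3 * x + n], t)
sys = complete(sys)

# Integer-typed `n` should be excluded from the tunables; the
# float/Real parameters form the tunable set and share one
# promoted numeric type at runtime.
tunable_parameters(sys)   # expected: contains k1, k2, k3 but not n
```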

@ChrisRackauckas (Member) commented

@AayushSabharwal can we add a check in MTK that any tunable has a symtype of `AbstractFloat`?

@ChrisRackauckas (Member) commented

> Any mixture of float-typed parameters is considered tunable and promoted to a common type, so mixing Float32 and Float64 symbolics will ultimately result in a single type (I guess Float64).

Yes exactly, it will get the promotion of all tunables.
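(That is just standard Julia promotion; mixing Float32 and Float64 entries yields Float64:)

```julia
julia> promote_type(Float32, Float64)
Float64
```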

@ChrisRackauckas (Member) commented

Which, BTW, is the same semantics as the AD and optimization libraries. It's just matching that behavior to ensure compatibility of the generated functions. So it's no longer symbolically possible to generate an MTK model whose tunable set is incompatible with actually being tuned. That is the major change.
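For context on why the tuned vector must be homogeneous: ForwardDiff, like most AD libraries, rebuilds the entire input vector with a single Dual element type:

```julia
using ForwardDiff

loss(p) = sum(abs2, p)

p = [1.0, 2.0]                     # the tunables as one homogeneous vector
g = ForwardDiff.gradient(loss, p)  # internally every entry of `p` is
                                   # re-wrapped in the same Dual type
```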

@TorkelE (Member) commented Aug 2, 2024

Basically, it seems that as long as one doesn't mix non-default parameter types with anything that requires AD/tuning, one should be fine. And if one does, one should know what one is doing.

Would it make sense to have a more detailed section in the MTK docs, which we could then link from Catalyst where relevant?

@ChrisRackauckas (Member) commented

We will need to write one, yes.

@AayushSabharwal (Member) commented

> @AayushSabharwal can we add a check in MTK that any tunable has a symtype of `AbstractFloat`?

We already do. Since `@parameters` by default has a symtype of `Real`, and forcing people to annotate `::AbstractFloat` would be breaking, it checks whether `symtype == Real || symtype <: AbstractFloat` (and the equivalent for arrays). Any variable not passing this check is put in the constants portion.
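To illustrate which symtypes pass that check (a sketch; `symtype` and `unwrap` are Symbolics utilities):

```julia
using ModelingToolkit, Symbolics

@parameters k n::Int x32::Float32

Symbolics.symtype(Symbolics.unwrap(k))    # Real -> passes, tunable-eligible
Symbolics.symtype(Symbolics.unwrap(n))    # Int  -> fails, constants portion
Symbolics.symtype(Symbolics.unwrap(x32))  # Float32 <: AbstractFloat -> passes
```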
