tum merge branch #221
base: main
Conversation
- Stochastic optimization with stochastic constraints implemented.
- Tested it on the "column simulation code"; graphs and other details are in the shared document.
- Also implemented a method to perform stochastic optimization when the design variables appear directly in the objective and the objective is not differentiable (variational objective, VO.py); see the sketch after this list.
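VO.py itself is not shown in this PR description, so here is a minimal sketch of the variational-optimization idea described above, assuming the standard score-function (log-derivative) gradient estimator with a Gaussian search distribution. The objective, function names, and hyperparameters are invented for illustration and are not taken from VO.py:

```python
import numpy as np

# Illustrative non-differentiable objective (not the actual column model):
# an absolute-value kink plus a jump discontinuity at x = 2.
def objective(x):
    return np.abs(x - 2.0) + 0.5 * (x > 2.0)

def variational_optimize(f, mu=0.0, sigma=1.0, lr=0.05, n_samples=64, n_iters=500):
    """Minimize E_{x ~ N(mu, sigma^2)}[f(x)] over mu using the
    score-function estimator: d/dmu E[f(x)] = E[f(x) * (x - mu) / sigma^2].
    No gradient of f itself is needed, so f may be non-differentiable."""
    rng = np.random.default_rng(seed=0)
    for _ in range(n_iters):
        x = rng.normal(mu, sigma, size=n_samples)
        fx = f(x)
        # Subtracting the sample mean as a baseline reduces estimator
        # variance without adding bias, since E[(x - mu) / sigma^2] = 0.
        grad_mu = np.mean((fx - fx.mean()) * (x - mu) / sigma**2)
        mu -= lr * grad_mu
    return mu

print(variational_optimize(objective))  # mu drifts toward the minimizer near x = 2
```

The same estimator extends to multivariate Gaussians and to learning sigma as well; the key point is that the expectation smooths the objective, so plain stochastic gradient descent on the distribution parameters applies even when f has kinks or jumps.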
…ion part. Note that it is not completely modular; just for reference.
…onSolver' into Calibration_Optimisation_HydrationSolver
# Conflicts:
#	environment.yml
#	lebedigital/simulation/precast_column.py
#	tests/demonstrator_scripts/test_column_simulation.py
…monstrator scripts updated
…corporated in the demonstrator_scripts, and snakemake was updated accordingly. The model learning/calibration routines are not properly tested yet; the same goes for the optimization. Also, some scripts are missing docstrings, and there are leftover testing lines here and there. The important thing is that they work.
Which files should be executed to see what is implemented (test files, examples?)? I usually start a review by looking at those to see how I can use the code, but I struggle to find the right entry point among the 120(!) new files. Also, the test files in demonstrator_calibration are not passing; they are removed from the automatic tests, but they do not pass on my machine either. As for the paper, we may not need to compile the paper itself, but it would be great to have a pipeline that makes it possible to understand how the results/figures were created. Is something like that implemented?
Hello,
I merged with main and added our scripts. Please review the pull request whenever time permits.