I am attempting to write a Python version of the function MBCn so I can use xarray to carry out the computation over a gridded file.
The R function accepts matrices (size tn by varn) as inputs, where tn is the number of timesteps and varn is the number of variables in the dataset. However, that same function, great as it is, only accepts point values (inputs for a single (lon, lat) pair).
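Roughly, the way I intend to lift the per-point function onto the grid is with xarray.apply_ufunc. This is only a sketch; the dimension names ("time", "lat", "lon"), the stacking of the variables into a single "variable" dimension, and the wrapper name are assumptions about my files, not part of MBCn itself:

```python
import xarray as xr

# Sketch: apply a per-point function that works on (time, variable) arrays
# over every (lat, lon) cell of gridded DataArrays.  "time" and "variable"
# are assumed dimension names; the point function is assumed to return a
# single adjusted (time, variable) array.
def apply_pointwise(point_func, obs, mod_c, mod_p):
    return xr.apply_ufunc(
        point_func,
        obs, mod_c, mod_p,
        input_core_dims=[["time", "variable"]] * 3,
        output_core_dims=[["time", "variable"]],
        vectorize=True,            # loop the point function over lat/lon
        dask="parallelized",       # optional, for chunked (dask-backed) inputs
        output_dtypes=[float],
    )
```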
My code is as follows:
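The structure I followed is the MBCn iteration from Cannon (2018): rotate the data with a random orthogonal matrix, apply univariate QDM to each rotated variable, rotate back, and repeat. Below is a bare-bones sketch of that structure only, not my actual code; `quantile_delta_mapping` stands in for the routine in BA.txt and is assumed to return the adjusted historical and projected series:

```python
import numpy as np

# Bare-bones sketch of the MBCn iteration loop (Cannon, 2018), for reference
# only.  obs_c, mod_c, mod_p are (tn, varn) arrays; quantile_delta_mapping
# is a stand-in for the function in BA.txt.
def mbcn_sketch(obs_c, mod_c, mod_p, quantile_delta_mapping, n_iter=30, seed=1):
    rng = np.random.default_rng(seed)
    n_vars = obs_c.shape[1]
    for _ in range(n_iter):
        # Random orthogonal rotation via QR of a standard-normal matrix.
        rot, _ = np.linalg.qr(rng.standard_normal((n_vars, n_vars)))
        o_rot, c_rot, p_rot = obs_c @ rot, mod_c @ rot, mod_p @ rot
        # Univariate QDM on each rotated variable.
        for j in range(n_vars):
            c_rot[:, j], p_rot[:, j] = quantile_delta_mapping(
                o_rot[:, j], c_rot[:, j], p_rot[:, j]
            )
        # Undo the rotation (rot is orthogonal, so its inverse is rot.T).
        mod_c, mod_p = c_rot @ rot.T, p_rot @ rot.T
    return mod_c, mod_p
```

(As far as I can tell from the R code, the QDM inside this loop is applied in its additive form to all rotated variables, since rotated values can be negative; the ratio handling for rainfall is only used on unrotated data.)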
The images below compare the code above with the official R version for one iteration:
(top: rainfall, bottom: temp)
However, over 30 iterations (the default number of iterations in the R version) the Python script deteriorates while R remains stable:
(top: rainfall, bottom: temp)
The function quantileDeltaMapping (attached below as BA.txt) has its own counterpart in R and, after triple-checking, my version duplicates the R one perfectly, so there is no problem there. I am assuming that the problem lies entirely with the piece of code I wrote above.
I should also mention that the R version performs a bias adjustment (called QDM) if the model inputs (mod.c and mod.p in the R code) have not yet been adjusted via QDM, so the inputs I passed to nMultivarBC have already been adjusted (the datasets I used are attached as a zip file).
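For context, the additive form of QDM does roughly the following. This is a simplified sketch of the idea only; the actual implementation is the one in BA.txt:

```python
import numpy as np

# Simplified additive QDM: map the historical model data onto the observed
# quantiles, and shift the projected data so that the model-projected change
# at each quantile is preserved.
def qdm_additive(obs_c, mod_c, mod_p, n_quantiles=100):
    tau = np.linspace(0.01, 0.99, n_quantiles)
    q_obs = np.quantile(obs_c, tau)
    q_mc = np.quantile(mod_c, tau)
    q_mp = np.quantile(mod_p, tau)
    # Historical period: ordinary quantile mapping onto the observations.
    tau_c = np.interp(mod_c, q_mc, tau)
    mod_c_adj = np.interp(tau_c, tau, q_obs)
    # Projection period: preserve the projected change at each quantile.
    tau_p = np.interp(mod_p, q_mp, tau)
    delta = mod_p - np.interp(tau_p, tau, q_mc)
    mod_p_adj = np.interp(tau_p, tau, q_obs) + delta
    return mod_c_adj, mod_p_adj
```

For rainfall the delta is multiplicative rather than additive (the ratio form of QDM), but the additive version is enough to show the idea.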
Attachments: BA.txt, data_files.zip
I have specified an extra argument in the function nMultiVarBC above so the user can specify whether the historical data or the model projections need to be adjusted (I did this to minimize the computational burden that would be encountered when inputting gridded files). May someone kindly point out how and why things go awry after several iterations?
Your help would be greatly appreciated.
best,
oxdub