runORDCAO
Here are explained the instructions to run ORCHIDEE off-line simulations, as prepared for the post-graduate course 'Interacción suelo-atmósfera y su modelado' (soil-atmosphere interaction and its modelling).
All the necessary files are available on the FTP site of the course:
ftp://ftp.cima.fcen.uba.ar/lluis.fita/Curso_InteraccionSueloAtmosfera/
A specific work-flow manager, OR1proc, has been created for the course; these instructions follow the use of that work-flow.
Getting work-flow
- Creation of a work-flow folder [oRWORKFLOW_1proc] (installOR)
mkdir installOR
- Going there
cd installOR
- Getting the files of the work-flow
wget -r -nH --cut-dirs=3 ftp://ftp.cima.fcen.uba.ar/lluis.fita/Curso_InteraccionSueloAtmosfera/TPs/components
- FTP does not preserve the execute permissions; they have to be added manually:
chmod u+x components/shell/run_multi-or_1proc.bash
chmod u+x components/shell/prepare_forcings.bash
chmod u+x components/ORtemplate/bin/orchideedriver
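- (Optional check, not part of the original work-flow) The execute permission can be verified by listing the three files; an x should appear among the user permissions of each of them
ls -l components/shell/run_multi-or_1proc.bash components/shell/prepare_forcings.bash components/ORtemplate/bin/orchideedriver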
Getting forcings
Here we will only get a minimal sub-set of the available forcings, just enough to run a test case. The following commands are run from the folder [oRWORKFLOW_1proc].
- Creation of the atmospheric forcing data folder structure
mkdir -p forcings/ATMOS/CRUNCEP
- Getting 3 files of atmospheric forcing
cd forcings/ATMOS/CRUNCEP
wget -r -nH --cut-dirs=6 ftp://ftp.cima.fcen.uba.ar/lluis.fita/Curso_InteraccionSueloAtmosfera/TPs/forcings/ATMOS/CRUNCEP/cruncep_halfdeg_1989.nc
wget -r -nH --cut-dirs=6 ftp://ftp.cima.fcen.uba.ar/lluis.fita/Curso_InteraccionSueloAtmosfera/TPs/forcings/ATMOS/CRUNCEP/cruncep_halfdeg_1990.nc
wget -r -nH --cut-dirs=6 ftp://ftp.cima.fcen.uba.ar/lluis.fita/Curso_InteraccionSueloAtmosfera/TPs/forcings/ATMOS/CRUNCEP/cruncep_halfdeg_1991.nc
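- (Optional check) If the netCDF utilities are available, the header of one of the downloaded forcing files can be inspected to confirm that the transfer worked (file name taken from the list above)
ncdump -h cruncep_halfdeg_1990.nc | head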
- Going back to [oRWORKFLOW_1proc]
cd ../../../
- Creation of the ancillary data folder structure
mkdir -p forcings/IGCM/SRF
- Getting the basic ancillary data
cd forcings/IGCM/SRF
wget -r -nH --cut-dirs=6 ftp://ftp.cima.fcen.uba.ar/lluis.fita/Curso_InteraccionSueloAtmosfera/TPs/forcings/IGCM/SRF/cartepente2d_15min.nc
wget -r -nH --cut-dirs=6 ftp://ftp.cima.fcen.uba.ar/lluis.fita/Curso_InteraccionSueloAtmosfera/TPs/forcings/IGCM/SRF/soils_param.nc
wget -r -nH --cut-dirs=6 ftp://ftp.cima.fcen.uba.ar/lluis.fita/Curso_InteraccionSueloAtmosfera/TPs/forcings/IGCM/SRF/reftemp.nc
- Other ancillary data (the following commands are run from forcings/IGCM/SRF; each file is downloaded into its own sub-folder):
mkdir -p PFTMAPS/CMIP6/ESA-LUH2v2/historical/15PFT.v2/
cd PFTMAPS/CMIP6/ESA-LUH2v2/historical/15PFT.v2/
wget -r -nH --cut-dirs=11 ftp://ftp.cima.fcen.uba.ar/lluis.fita/Curso_InteraccionSueloAtmosfera/TPs/forcings/IGCM/SRF/PFTMAPS/CMIP6/ESA-LUH2v2/historical/15PFT.v2/PFTmap_1990.nc
cd ../../../../../
mkdir -p WOODHARVEST/LUH2v2/historical4
cd WOODHARVEST/LUH2v2/historical4
wget -r -nH --cut-dirs=9 ftp://ftp.cima.fcen.uba.ar/lluis.fita/Curso_InteraccionSueloAtmosfera/TPs/forcings/IGCM/SRF/WOODHARVEST/LUH2v2/historical4/woodharvest_1990.nc
cd ../../../
mkdir SOIL
cd SOIL
wget -r -nH --cut-dirs=7 ftp://ftp.cima.fcen.uba.ar/lluis.fita/Curso_InteraccionSueloAtmosfera/TPs/forcings/IGCM/SRF/SOIL/soil_bulk_and_ph.nc
cd ..
mkdir albedo
cd albedo
wget -r -nH --cut-dirs=7 ftp://ftp.cima.fcen.uba.ar/lluis.fita/Curso_InteraccionSueloAtmosfera/TPs/forcings/IGCM/SRF/albedo/alb_bg_modisopt_2D_ESA_v3.nc
cd ..
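- (Optional check) Still from forcings/IGCM/SRF, all the ancillary files fetched so far can be listed at once; each of them is referenced later in the configuration file
find . -name '*.nc'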
Running ORCHIDEE
- Creation of a working directory [WORKDIR] (InteraccionSueloAtmosfera)
cd ${HOME}
mkdir InteraccionSueloAtmosfera
cd InteraccionSueloAtmosfera
- Creation of a test folder
mkdir test
cd test
- Copying the configuration file and linking the running script
cp ~/installOR/components/shell/or_1-proc.config ./
ln ~/installOR/components/shell/run_multi-or_1proc.bash ./
- Editing the configuration file (or_1-proc.config). It should look like:
# scratch
scratch = false
# Period
period=1990-1990
# config
pyBIN=/usr/bin/python
OUTfolder=/home/administrador/InteraccionSueloAtmosfera/test
foldOR1proc=/home/administrador/installOR
forcingfolder=/home/administrador/installOR/forcings
# Ancillary data, 1 file without year (original name of the files with the ancillary data [folderName, since ${forcings}]@[origFilen]@[ORfilen])
filestolink=IGCM/SRF/albedo@alb_bg_modisopt_2D_ESA_v3.nc@alb_bg.nc:IGCM/SRF@cartepente2d_15min.nc@cartepente2d_15min.nc:IGCM/SRF/SOIL@soil_bulk_and_ph.nc@soil_bulk_and_ph.nc:IGCM/SRF@soils_param.nc@soils_param.nc:IGCM/SRF@reftemp.nc@reftemp.nc
# Ancillary data, 1 file with year (original name of the files with the ancillary data [folderName, since ${forcings}]@[origFilen]@[ORfilen], [YYYY] to be changed by the corresponding year)
filestolinkyr=IGCM/SRF/PFTMAPS/CMIP6/ESA-LUH2v2/historical/15PFT.v2@PFTmap_YYYY.nc@PFTmap.nc:IGCM/SRF/WOODHARVEST/LUH2v2/historical4@woodharvest_YYYY.nc@woodharvest.nc
# Ancillary data, 3 files with year-1,year,year+1 (original name of the files with the ancillary data [folderName, since ${forcings}]@[origFilen]@[ORfilen], [YYYY] to be changed by the corresponding year-1, year, year+1)
filestolink3yr=
# Atmospheric forcing as [ForcingName]@[HeaderForcingFiles] [assuming ${forcingfolder}/ATMOS/{ForcingName}/{HeaderForcingFiles}_YYYY.nc]
atmos=CRUNCEP@cruncep_halfdeg
# Additional parameters for run.def as [paramName1]:[values1];...;[paramNameN]:[valuesN] (! for spaces)
addparams=LIMIT_WEST@-66.9598;LIMIT_EAST@-65.4598;LIMIT_SOUTH@-33.9648;LIMIT_NORTH@-32.4648;RIVER_ROUTING@n
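- (Suggestion, assuming the folders were created as described above but under a different user account than administrador) all the absolute paths in the configuration file can be adapted in one go with sed
sed -i "s|/home/administrador|${HOME}|g" or_1-proc.config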
- Running it (in the background, to free the terminal so that the progress of the model can be checked):
nohup ./run_multi-or_1proc.bash >& run_multi_or_1proc.log &
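- Since both the standard output and the standard error of the script are redirected to run_multi_or_1proc.log, its progress can be followed live (Ctrl+C stops only the tail command, not the simulation)
tail -f run_multi_or_1proc.log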
- In order to know whether the simulation is running, the output files should appear in [WORKDIR] and keep growing in size.
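- For instance (a minimal sketch, any equivalent listing works), the file sizes in [WORKDIR] can be monitored every 30 seconds with
watch -n 30 ls -lh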
- When finished, the orout folder should contain one sub-folder for each simulated year (in this case only 1990) with:
- sechiba_history_0000.nc: output for the SECHIBA module
- stomate_history_0000.nc: output for the STOMATE module
- sechiba_rest_out.nc: restart for the SECHIBA module
- stomate_rest_out.nc: restart for the STOMATE module
- out_orchidee_0000: standard output from ORCHIDEE
- run_or.log: standard output from the execution of the model
- run.def: configuration of ORCHIDEE
- used_run.def: full configuration of ORCHIDEE (including the default values of parameters not set in run.def)
- rebuild.bash: bash script to join multiple output files (not used here, only useful for multi-process runs)
- In order to know whether a simulation has successfully finished, there should be more than one time-step in the outputs of each component (SECHIBA, STOMATE):
ncdump -h orout/1990/sechiba_history_0000.nc | grep time_counter | grep UNLIMITED
	time_counter = UNLIMITED ; // (365 currently)
ncdump -h orout/1990/stomate_history_0000.nc | grep time_counter | grep UNLIMITED
	time_counter = UNLIMITED ; // (36 currently)
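- (Optional, a sketch assuming the per-year folders under orout keep the file names shown above) the same check can be done for all simulated years at once with a small loop
for yr in orout/*/ ; do
  echo ${yr}
  ncdump -h ${yr}sechiba_history_0000.nc | grep UNLIMITED
done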
- If something went wrong, one should check the messages (usually at the end of the logs):
tail orout/1990/out_orchidee_0000
tail orout/1990/run_or.log
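- (Optional, assuming the usual convention that failures are reported with the word ERROR somewhere in the logs) the two files can also be searched directly
grep -i error orout/1990/out_orchidee_0000 orout/1990/run_or.log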
Stop simulations
Because the model is running in the background, one should first get the PID of the process (18795 in the example):
$ ps -ef | grep multi
adminis+ 18795  3560  0 12:17 pts/4    00:00:00 /bin/bash ./run_multi-or_1proc.bash
adminis+ 20742  3619  0 13:54 pts/18   00:00:00 grep --color=auto multi
$ kill -9 18795
Or, far more drastically:
$ killall run_multi-or_1proc.bash
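- If the orchideedriver executable launched by the script is still running after the script itself has been killed (whether this happens depends on the moment the run is interrupted), it can be stopped in the same way; orchideedriver is the binary installed in the first section
$ killall orchideedriver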
Getting all atmospheric forcings
Now we can complete the download of all the atmospheric forcings (from the [oRWORKFLOW_1proc] folder):
cd forcings/
wget -r -nH --cut-dirs=4 ftp://ftp.cima.fcen.uba.ar/lluis.fita/Curso_InteraccionSueloAtmosfera/TPs/forcings/ATMOS
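- Once all the yearly CRUNCEP files are in place, longer periods can be simulated by changing only the period entry of or_1-proc.config, for instance as below (this assumes that the PFTmap_YYYY.nc and woodharvest_YYYY.nc ancillary files have also been downloaded for every year of the new period)
period=1989-1991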