runORDCAO

These instructions explain how to run ORCHIDEE off-line simulations, as prepared for the post-graduate course <I>'Interacción suelo-atmósfera y su modelado'</I>.

All the necessary files are uploaded at the ftp of the course:

ftp://ftp.cima.fcen.uba.ar/lluis.fita/Curso_InteraccionSueloAtmosfera/

A specific work-flow management has been created for the course ([[OR1proc OR 1proc]]); these instructions follow the use of that work-flow.

= Getting work-flow =

# Creation of a work-flow folder [oRWORKFLOW_1proc] (installOR)
<PRE>
mkdir installOR
</PRE>
# Going there
<PRE>
cd installOR
</PRE>
# Getting the files of the work-flow
<PRE>
wget -r -nH --cut-dirs=3 ftp://ftp.cima.fcen.uba.ar/lluis.fita/Curso_InteraccionSueloAtmosfera/TPs/components
</PRE>
# ftp does not keep the execution permissions. They have to be added manually:
<PRE>
chmod u+x components/shell/run_multi-or_1proc.bash
chmod u+x components/shell/prepare_forcings.bash
chmod u+x components/ORtemplate/bin/orchideedriver
</PRE>
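To verify that the execute permission was set, the scripts can be listed (the <code>x</code> flag should appear in the user permissions):
<PRE>
ls -l components/shell/run_multi-or_1proc.bash components/shell/prepare_forcings.bash components/ORtemplate/bin/orchideedriver
</PRE>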

= Getting forcings =

Here we will only get a minimal sub-set of the available forcings, just enough to run a test case. All commands are run from the folder [oRWORKFLOW_1proc].
# Creation of the atmospheric forcing data folder structure
<PRE>
mkdir -p forcings/ATMOS/CRUNCEP
</PRE>
# Getting 3 files of atmospheric forcing
<PRE>
cd forcings/ATMOS/CRUNCEP
wget -r -nH --cut-dirs=6 ftp://ftp.cima.fcen.uba.ar/lluis.fita/Curso_InteraccionSueloAtmosfera/TPs/forcings/ATMOS/CRUNCEP/cruncep_halfdeg_1989.nc
wget -r -nH --cut-dirs=6 ftp://ftp.cima.fcen.uba.ar/lluis.fita/Curso_InteraccionSueloAtmosfera/TPs/forcings/ATMOS/CRUNCEP/cruncep_halfdeg_1990.nc
wget -r -nH --cut-dirs=6 ftp://ftp.cima.fcen.uba.ar/lluis.fita/Curso_InteraccionSueloAtmosfera/TPs/forcings/ATMOS/CRUNCEP/cruncep_halfdeg_1991.nc
</PRE>
# Going back to [oRWORKFLOW_1proc]
<PRE>
cd ../../../
</PRE>
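The three yearly files above could equivalently be fetched with a small loop, run from <code>forcings/ATMOS/CRUNCEP</code> (a sketch; it only assumes the same FTP layout as the individual commands):
<PRE>
for yr in 1989 1990 1991; do
  wget -r -nH --cut-dirs=6 ftp://ftp.cima.fcen.uba.ar/lluis.fita/Curso_InteraccionSueloAtmosfera/TPs/forcings/ATMOS/CRUNCEP/cruncep_halfdeg_${yr}.nc
done
</PRE>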
# Creation of the ancillary data folder structure
<PRE>
mkdir -p forcings/IGCM/SRF
</PRE>
# Getting the basic ancillary data
<PRE>
cd forcings/IGCM/SRF
wget -r -nH --cut-dirs=6 ftp://ftp.cima.fcen.uba.ar/lluis.fita/Curso_InteraccionSueloAtmosfera/TPs/forcings/IGCM/SRF/cartepente2d_15min.nc
wget -r -nH --cut-dirs=6 ftp://ftp.cima.fcen.uba.ar/lluis.fita/Curso_InteraccionSueloAtmosfera/TPs/forcings/IGCM/SRF/soils_param.nc
wget -r -nH --cut-dirs=6 ftp://ftp.cima.fcen.uba.ar/lluis.fita/Curso_InteraccionSueloAtmosfera/TPs/forcings/IGCM/SRF/reftemp.nc
</PRE>
# Other ancillary data (to be downloaded from <code>forcings/IGCM/SRF</code>; each file must end up in the sub-folder referenced later by the configuration file, hence the extra <code>cd</code> commands)
<PRE>
mkdir -p PFTMAPS/CMIP6/ESA-LUH2v2/historical/15PFT.v2/
cd PFTMAPS/CMIP6/ESA-LUH2v2/historical/15PFT.v2/
wget -r -nH --cut-dirs=11 ftp://ftp.cima.fcen.uba.ar/lluis.fita/Curso_InteraccionSueloAtmosfera/TPs/forcings/IGCM/SRF/PFTMAPS/CMIP6/ESA-LUH2v2/historical/15PFT.v2/PFTmap_1990.nc
cd ../../../../../
mkdir -p WOODHARVEST/LUH2v2/historical4
cd WOODHARVEST/LUH2v2/historical4
wget -r -nH --cut-dirs=9 ftp://ftp.cima.fcen.uba.ar/lluis.fita/Curso_InteraccionSueloAtmosfera/TPs/forcings/IGCM/SRF/WOODHARVEST/LUH2v2/historical4/woodharvest_1990.nc
cd ../../../
mkdir SOIL
cd SOIL
wget -r -nH --cut-dirs=7 ftp://ftp.cima.fcen.uba.ar/lluis.fita/Curso_InteraccionSueloAtmosfera/TPs/forcings/IGCM/SRF/SOIL/soil_bulk_and_ph.nc
cd ..
mkdir albedo
cd albedo
wget -r -nH --cut-dirs=7 ftp://ftp.cima.fcen.uba.ar/lluis.fita/Curso_InteraccionSueloAtmosfera/TPs/forcings/IGCM/SRF/albedo/alb_bg_modisopt_2D_ESA_v3.nc
cd ..
</PRE>
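Back in the folder [oRWORKFLOW_1proc], a quick way to check that all the ancillary files ended up where the configuration file below expects them (the list should show the NetCDF files downloaded in this section):
<PRE>
cd ../../../
find forcings/IGCM/SRF -name '*.nc'
</PRE>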

= Running ORCHIDEE =

Creation of a working directory [WORKDIR] (InteraccionSueloAtmosfera)
<PRE>
cd ${HOME}
mkdir InteraccionSueloAtmosfera
cd InteraccionSueloAtmosfera
</PRE>
# Creation of a test folder
<PRE>
mkdir test
cd test
</PRE>
# Copying the configuration file and linking the running script
<PRE>
cp ~/installOR/components/shell/or_1-proc.config ./
ln ~/installOR/components/shell/run_multi-or_1proc.bash ./
</PRE>
# Editing the configuration file (or_1-proc.config). It should look like:
<PRE>
# scratch
scratch = false

# Period
period=1990-1990

# config
pyBIN=/usr/bin/python
OUTfolder=/home/administrador/InteraccionSueloAtmosfera/test
foldOR1proc=/home/administrador/installOR
forcingfolder=/home/administrador/installOR/forcings

# Ancillary data, 1 file without year (original name of the ancillary-data files, as [folderName, from ${forcings}]@[origFilen]@[ORfilen])
filestolink=IGCM/SRF/albedo@alb_bg_modisopt_2D_ESA_v3.nc@alb_bg.nc:IGCM/SRF@cartepente2d_15min.nc@cartepente2d_15min.nc:IGCM/SRF/SOIL@soil_bulk_and_ph.nc@soil_bulk_and_ph.nc:IGCM/SRF@soils_param.nc@soils_param.nc:IGCM/SRF@reftemp.nc@reftemp.nc

# Ancillary data, 1 file with year (original name of the ancillary-data files, as [folderName, from ${forcings}]@[origFilen]@[ORfilen]; [YYYY] is replaced by the corresponding year)
filestolinkyr=IGCM/SRF/PFTMAPS/CMIP6/ESA-LUH2v2/historical/15PFT.v2@PFTmap_YYYY.nc@PFTmap.nc:IGCM/SRF/WOODHARVEST/LUH2v2/historical4@woodharvest_YYYY.nc@woodharvest.nc

# Ancillary data, 3 files with year-1, year, year+1 (original name of the ancillary-data files, as [folderName, from ${forcings}]@[origFilen]@[ORfilen]; [YYYY] is replaced by the corresponding year-1, year, year+1)
filestolink3yr=

# Atmospheric forcing as [ForcingName]@[HeaderForcingFiles] [assuming ${forcingfolder}/ATMOS/{ForcingName}/{HeaderForcingFiles}_YYYY.nc]
atmos=CRUNCEP@cruncep_halfdeg

# Additional parameters for run.def as [paramName1]@[values1];...;[paramNameN]@[valuesN] (! for spaces)
addparams=LIMIT_WEST@-66.9598;LIMIT_EAST@-65.4598;LIMIT_SOUTH@-33.9648;LIMIT_NORTH@-32.4648;RIVER_ROUTING@n
</PRE>
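As an illustration of the <code>filestolink</code> syntax above (entries separated by ':', each one as folder@originalName@nameExpectedByORCHIDEE), the work-flow presumably resolves each entry into a symbolic link inside the run directory. A minimal sketch of that expansion in bash (not the literal code of <code>run_multi-or_1proc.bash</code>):
<PRE>
forcingfolder=/home/administrador/installOR/forcings
entry="IGCM/SRF/albedo@alb_bg_modisopt_2D_ESA_v3.nc@alb_bg.nc"
# split the entry at the '@' separators
IFS='@' read -r folder orig orname <<< "${entry}"
# link the original file under the name ORCHIDEE expects (here in the current directory)
ln -s ${forcingfolder}/${folder}/${orig} ${orname}
</PRE>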
# Running it (in the background, to free the terminal so that the correct running of the model can be checked)
<PRE>
nohup ./run_multi-or_1proc.bash >& run_multi_or_1proc.log &
</PRE>
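The progress can be followed on the log file while the run stays in the background:
<PRE>
tail -f run_multi_or_1proc.log
</PRE>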
# In order to know whether the simulation is running, the output files should appear in the [WORKDIR] and keep growing in size.
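A simple way to watch this from the [WORKDIR] (refreshing the listing every 30 seconds):
<PRE>
watch -n 30 ls -lh
</PRE>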
# When finished, the <code>orout</code> folder should contain one folder per simulated year (in this case only 1990) with:
* <code>sechiba_history_0000.nc</code>: output for the SECHIBA module
* <code>stomate_history_0000.nc</code>: output for the STOMATE module
* <code>sechiba_rest_out.nc</code>: restart for the SECHIBA module
* <code>stomate_rest_out.nc</code>: restart for the STOMATE module
* <code>out_orchidee_0000</code>: standard output from ORCHIDEE
* <code>run_or.log</code>: standard output from the execution of the model
* <code>run.def</code>: configuration of ORCHIDEE
* <code>used_run.def</code>: full configuration of ORCHIDEE (including default values of parameters not set in run.def)
* <code>rebuild.bash</code>: bash script to join multiple output files (not used here; only useful for multi-processor runs)
# In order to know whether a simulation finished successfully, there should be more than one time-step in the outputs of each component (SECHIBA, STOMATE):
<PRE>
ncdump -h orout/1990/sechiba_history_0000.nc | grep time_counter | grep UNLIMITED
	time_counter = UNLIMITED ; // (365 currently)
ncdump -h orout/1990/stomate_history_0000.nc | grep time_counter | grep UNLIMITED
	time_counter = UNLIMITED ; // (36 currently)
</PRE>
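For longer periods the same check can be looped over all the simulated years:
<PRE>
for yrdir in orout/*/; do
  echo ${yrdir}
  ncdump -h ${yrdir}sechiba_history_0000.nc | grep time_counter | grep UNLIMITED
done
</PRE>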
# If something went wrong, one should check the messages (usually at the end of the logs):
<PRE>
tail orout/1990/out_orchidee_0000
tail orout/1990/run_or.log
</PRE>
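Error messages can also be searched for directly (the exact wording depends on the ORCHIDEE version, so this is only a rough filter):
<PRE>
grep -i error orout/1990/out_orchidee_0000
</PRE>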

= Stop simulations =

Because the model is running in the background, one has to get the PID of the process (18795 in this example):
<PRE>
$ ps -ef | grep multi
adminis+ 18795  3560  0 12:17 pts/4    00:00:00 /bin/bash ./run_multi-or_1proc.bash
adminis+ 20742  3619  0 13:54 pts/18   00:00:00 grep --color=auto multi
$ kill -9 18795
</PRE>
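The PID can also be obtained without parsing the <code>ps</code> output:
<PRE>
pgrep -f run_multi-or_1proc.bash
</PRE>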
Or, far more drastically:
<PRE>
$ killall run_multi-or_1proc.bash
</PRE>

= Getting all atmospheric forcings =

Now we can complete the download of all the forcings:

<PRE>
cd forcings/
wget -r -nH --cut-dirs=4 ftp://ftp.cima.fcen.uba.ar/lluis.fita/Curso_InteraccionSueloAtmosfera/TPs/forcings/ATMOS
</PRE>
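Once the download finishes, the available years and the total size can be checked from <code>forcings/</code>:
<PRE>
ls ATMOS/CRUNCEP/
du -sh ATMOS/CRUNCEP/
</PRE>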
