WRF

= Model description =
Weather Research and Forecast model ([http://www2.mmm.ucar.edu/wrf/users/ WRF], Skamarock, 2008) is a limited-area atmospheric model developed by a consortium of American institutions with contributions from the whole community. It is a non-hydrostatic primitive-equations model used in a large variety of research areas.

Technical aspects of the model (v 4.0) are provided in this [http://www2.mmm.ucar.edu/wrf/users/docs/technote/v4_technote.pdf pdf].
   
= WRF and CIMA =
CIMA has an HPC called <code>hydra</code> to perform highly intensive modelling efforts with WRF.

== hydra ==
The HPC is part of the Servicio Nacional de Cómputo de Alta Capacidad (SNCAD).

It was re-installed and re-organized in January 2022, and it has been adapted to configurations that are common at other international research centers.

=== Selection of compilers & libraries ===
<code>hydra</code> does not have the <code>module</code> software to manage compilers and libraries, but it does have a script which allows users to select the compiler. This is done at the terminal with <code>source /opt/load-libs.sh</code>

Example selecting Intel 2021.4.0, MPICH and NetCDF 4:

<pre style="shell">
$ source /opt/load-libs.sh

Available libraries:
1) INTEL 2021.4.0: MPICH 3.4.2, NetCDF 4, HDF5 1.10.5, JASPER 2.0.33
2) INTEL 2021.4.0: OpenMPI 4.1.2, NetCDF 4, HDF5 1.10.5, JASPER 2.0.33
3) GNU 10.2.1: MPICH 3.4.2, NetCDF 4, HDF5 1.10.5, JASPER 2.0.33
4) GNU 10.2.1: OpenMPI 4.1.2, NetCDF 4, HDF5 1.10.5, JASPER 2.0.33
0) Exit
Choose an option: 1

The following libraries, compiled with Intel 2021.4.0 compilers, were loaded:
* MPICH 3.4.2
* NetCDF 4
* HDF5 1.10.5
* JASPER 2.0.33

To change it please logout and login again.

To load this libraries from within a script add this line to script:
source /opt/load-libs.sh 1
</pre>
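
Since the selection menu is interactive, batch jobs should pass the option number directly, as the message above indicates. A minimal sketch of a PBS job preamble doing this is shown below; the resource request, e-mail address and executable name are illustrative placeholders, not the contents of the actual scripts provided in <code>/share/WRF</code>:
<pre style="shell">
#!/bin/bash
#PBS -N wrf_job
#PBS -l nodes=1:ppn=16            # illustrative resource request
#PBS -M someone@example.org       # hypothetical e-mail for notifications
#PBS -m abe

# Load the Intel 2021.4.0 + MPICH + NetCDF stack non-interactively (option 1)
source /opt/load-libs.sh 1

cd $PBS_O_WORKDIR
mpirun ./wrf.exe                  # illustrative executable
</pre>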

== Compilation ==
Multiple versions of the model are available pre-compiled in <code>hydra</code>, with the following structure:

<pre style="shell">
/opt/wrf/WRF-{version}/{compiler}/{compiler-version}/{architecture}/{WRF/WPS}
</pre>
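
To check which combinations are actually installed, one can simply list that tree (a minimal check; the versions present may change over time):
<pre style="shell">
$ ls -d /opt/wrf/WRF-*/*/*/*/
</pre>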
   
== Forcings ==

=== Atmospheric forcings ===
This will change once <code>papa-deimos</code> is fully operational.

Provide the atmospheric conditions to the model at a given date.

* There is a shared space called <code>/share</code>
* At <code>hydra</code> all the forcings are at:
<pre>/share/DATA/</pre>
 
* ''ERA-Interim''
** Thus, part of the [https://www.ecmwf.int/en/forecasts/datasets/reanalysis-datasets/era-interim ERA-Interim] forcings are:
<pre>/share/DATA/re-analysis/ERA-Interim/</pre>
 
** Global monthly files at 0.75&deg; horizontal resolution and all time-steps (00, 06, 12, 18) are labelled as:
*** <code>ERAI_pl[YYYY][MM]_[var1]-[var2].grib</code>: pressure-level variables (all levels). GRIB codes as:
**** 129: geopotential
**** 157: relative humidity
**** 130: temperature
**** 131: u-wind
**** 132: v-wind
*** <code>ERAI_sfc[YYYY][MM].grib</code>: all surface-level variables (step 0)
** To download data:
*** Generate the files from the ECMWF ERA-Interim web-page
*** Go to the folder in <code>hydra</code>
<pre>cd /share/DATA/re-analysis/ERA-Interim/</pre>
*** Get the file (as a link from the ECMWF web-page, right button on `Download grib'), e.g.:
<pre>$ wget https://stream.ecmwf.int/data/atls05/data/data02/scratch/_mars-atls05-a82bacafb5c306db76464bc7e824bb75-zn7P44.grib</pre>
*** Rename the file according to its content
<pre>$ mv _mars-atls05-a82bacafb5c306db76464bc7e824bb75-zn7P44.grib ERAI_sfc201302.grib</pre>
 
* ''ERA5''
** Thus, part of the [https://www.ecmwf.int/en/forecasts/datasets/reanalysis-datasets/era5 ERA5] forcings are:
<pre>/share/DATA/re-analysis/ERA5/</pre>

* ''NCEP-NNRP1''
** Thus, the full [https://www.esrl.noaa.gov/psd/data/gridded/data.ncep.reanalysis.html NCEP-NNRP1] forcings are ('''NOTE: no land data!!!'''):
<pre>/share/DATA/re-analysis/NCEP_NNRP/</pre>

=== Morphological forcings ===
Provide the geomorphological information for the domain of simulation: topography, land-use, vegetation types, etc.

* In WRF there is a huge amount of data and sources at different resolutions. At <code>hydra</code> everything is already available at:
<pre>/share/GEOG/</pre>
* They are ready to be used.
   
= GENERIC Model use =
WRF has two main parts:

* ''WPS'': generation of the domain and of the initial and boundary conditions. It runs 4 programs:
** ''geogrid'': domain generation
** ''ungrib'': unpacking of the atmospheric forcing <code>grib</code> files
** ''metgrid'': horizontal interpolation of the unpacked atmospheric forcing files at the domain of simulation
** ''real'': generation of the initial and boundary conditions using the <code>metgrid</code> output
* ''WRF'': the model itself: <code>wrf.exe</code>
   
At <code>hydra</code> all the code is already compiled at <code>/opt/wrf</code> (with folders <code>WPS</code> and <code>WRF</code>). As an example, WRF v4.3.3 compiled with the Intel compilers with distributed and shared memory:
<pre>/opt/wrf/WRF-4.3.3/intel/2021.4.0/dm+sm/WRF</pre>
== WPS ==
Let's assume that we work in a folder called <code>$WORKDIR</code> (at the user's <code>${HOME}</code> at <code>hydra</code>) and with a given WRF version located in <code>$WRFversion</code>. As an example, let's create two nested domains: one of ''30 km'' for the entire South America and a second one of ''4.285 km'' for the Córdoba mountain ranges (see the [[WRF/namelist.wps]]).
   
[[File:CDXWRF_domain_test.png|frame|50px|WRF two domain configuration 30 and 4.285 km]]

=== geogrid ===
It is used to generate the domain.

* Create a folder for the geogrid section
<pre style="shell">
$ cd $WORKDIR
$ mkdir geogrid
$ cd geogrid
</pre>

* Link and copy all the necessary files (do not forget to add <code>opt_geogrid_tbl_path = './'</code> into the <code>&geogrid</code> section of the namelist!):
<pre style="shell">
$ ln -s $WRFversion/WPS/geogrid/geogrid.exe ./
$ ln -s $WRFversion/WPS/geogrid/GEOGRID.TBL.ARW ./GEOGRID.TBL
$ cp $WRFversion/WPS/geogrid/namelist.wps ./
</pre>

* Domain configuration is done via <code>namelist.wps</code> (more information at the [https://www2.mmm.ucar.edu/wrf/users/docs/user_guide_V3/user_guide_V3.9/users_guide_chap3.html WPS user guide]); a sketch for the two-domain example is given right after this block
<pre style="shell">
$ vim namelist.wps
</pre>
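
A hedged sketch of the <code>&share</code>/<code>&geogrid</code> entries for a two-nest configuration consistent with the 30 km / 4.285 km example (this is not the actual [[WRF/namelist.wps]]; grid sizes, nest position, projection and reference coordinates are illustrative and must be adapted):
<pre style="shell">
&share
 max_dom = 2,
/

&geogrid
 parent_id         = 1,  1,
 parent_grid_ratio = 1,  7,     ! 30 km / 7 = ~4.285 km
 i_parent_start    = 1, 60,     ! illustrative nest position
 j_parent_start    = 1, 60,
 e_we = 200, 106,               ! illustrative grid sizes
 e_sn = 240, 106,
 geog_data_res = 'default', 'default',
 dx = 30000.,
 dy = 30000.,
 map_proj = 'mercator',         ! illustrative projection
 ref_lat  = -30.0,
 ref_lon  = -64.0,
 truelat1 = -30.0,
 geog_data_path = '/share/GEOG/',
 opt_geogrid_tbl_path = './',
/
</pre>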

In <code>hydra</code>, the path to the geographic data has to be defined in the <code>namelist.wps</code> as follows:
<pre style="shell">
geog_data_path = '/share/GEOG/'
</pre>

Once it is defined, run it (all the necessary PBS script files are available in <code>/share/WRF</code>; you only need to change the number of processes and the user's e-mail):
<pre style="shell">
$ cp /share/WRF/launch_geogrid_intel.pbs ./
$ qsub launch_geogrid_intel.pbs
</pre>

* This will create the domain files, one for each domain:
<pre style="shell">
geo_em.d[nn].nc
</pre>

* Some variables from the geogrid files:
** <code>LANDMASK</code>: sea-land mask
** <code>XLAT_M</code>: latitude on mass points
** <code>XLONG_M</code>: longitude on mass points
** <code>HGT_M</code>: orographic height
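
A quick way to verify the created domains (assuming the NetCDF utilities of the loaded library set are in the <code>PATH</code>) is to inspect the header and a variable of the <code>geo_em</code> files:
<pre style="shell">
$ ncdump -h geo_em.d01.nc | head -n 20
$ ncdump -v HGT_M geo_em.d02.nc | less
</pre>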
=== ungrib ===

Prepare and unpack the grib files from the GCM forcing.

<B>NOTE:</B> In case the atmospheric forcing data is not in GRIB format but in standard netCDF format, the user will need to use the [https://gitlab.in2p3.fr/ipsl/lmd/intro/regipsl/regipsl/-/wikis/Tools/nc2ps nc2wps] tool, which is already compiled in: <code>/share/tools/RegIPSL/tools/nc2wps</code>.

* Creation of the folder (from <code>$WORKDIR</code>)
<pre style="shell">
$ mkdir ungrib
$ cd ungrib
</pre>

* Linking the necessary files from the compiled source
<pre style="shell">
$ ln -s $WRFversion/WPS/ungrib/ungrib.exe ./
$ ln -s $WRFversion/WPS/link_grib.csh ./
$ cp ../geogrid/namelist.wps ./
</pre>

* Edit the namelist <code>namelist.wps</code> to set up the right period of simulation and frequency of forcing (you can also use nano, emacs, ...); a sketch is given right after this block
<pre style="shell">
$ vim namelist.wps
</pre>
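
The period and forcing frequency are controlled by the <code>&share</code> section; a minimal sketch for the December 2012 example used below (the dates and interval are illustrative and must match the available GRIB files):
<pre style="shell">
&share
 start_date = '2012-12-01_00:00:00', '2012-12-01_00:00:00',
 end_date   = '2012-12-10_00:00:00', '2012-12-10_00:00:00',
 interval_seconds = 21600,   ! 6-hourly forcing (00, 06, 12, 18 UTC)
/
</pre>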

* Creation of a folder for the necessary GRIB files and linking of the necessary files (e.g. 4 files per month) from a folder with all the necessary data, called <code>$inDATA</code>
<pre style="shell">
$ mkdir GribDir
$ cd GribDir
$ ln -s $inDATA/*201212*.grib ./
$ cd ..
</pre>

* Re-link the files for WRF with its own script
<pre style="shell">
./link_grib.csh GribDir/*
</pre>

* Should appear:
<pre style="shell">
$ ls GRIBFILE.AA*
GRIBFILE.AAA  GRIBFILE.AAB  GRIBFILE.AAC  GRIBFILE.AAD
</pre>

* We need to provide the equivalences of the GRIB codes to the real variables. WRF comes with already defined GRIB equivalences from different sources in the folder <code>Variable_Tables</code>. In this example we use ECMWF ERA-Interim at pressure levels, thus we link
<pre style="shell">
$ ln -s $WRFversion/WPS/ungrib/Variable_Tables/Vtable.ERA-interim.pl ./Vtable
</pre>

* We need to take the domain file and get the right dates
<pre style="shell">
$ cp ../geogrid/namelist.wps ./
</pre>

* Files can be unpacked using the PBS job file (the user's e-mail has to be changed)
<pre style="shell">
$ cp /share/WRF/launch_ungrib_intel.pbs ./
$ qsub launch_ungrib_intel.pbs
</pre>

* If everything went fine, there should appear:
 
<pre>FILE:[YYYY]-[MM]-[DD]_[HH]</pre>
 
* And...
<pre>$ tail run_ungrib.log
(...)
**********
Done deleting temporary files.
**********

!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
!  Successful completion of ungrib.   !
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
</pre>

=== metgrid ===
Horizontal interpolation of the atmospheric forcing data at the domain of simulation.

* Creation of the folder (from <code>$WORKDIR</code>)
<pre style="shell">
$ mkdir metgrid
$ cd metgrid
</pre>

* Getting the necessary files (do not forget <code>opt_metgrid_tbl_path = './'</code> in the <code>&metgrid</code> section of the namelist, to avoid: <code>ERROR: Could not open file METGRID.TBL</code>)
<pre style="shell">
$ ln -s $WRFversion/WPS/metgrid/metgrid.exe ./
$ ln -s $WRFversion/WPS/metgrid/METGRID.TBL.ARW ./METGRID.TBL
</pre>

* Getting the <code>ungrib</code> output
<pre style="shell">
$ ln -s ../ungrib/FILE* ./
</pre>

** Link the domains of simulation
<pre style="shell">
$ ln -s ../geogrid/geo_em.d* ./
</pre>

** Link the namelist from <code>ungrib</code> (to make sure we are using the same!)
<pre style="shell">
$ ln -s ../ungrib/namelist.wps ./
</pre>

* Get the PBS (job queue script) to run <code>metgrid.exe</code> (remember to edit the user's e-mail in the PBS job)
<pre style="shell">
$ cp /share/WRF/launch_metgrid_intel.pbs ./
$ qsub launch_metgrid_intel.pbs
</pre>

* If everything went fine one should have
<pre>
met_em.d[nn].[YYYY]-[MM]-[DD]_[HH]:[MI]:[SS].nc
</pre>

* And...
<pre style="shell">
$ tail run_metgrid.log
(...)
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
!  Successful completion of metgrid.  !
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
</pre>
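
Before running <code>real.exe</code> it is worth checking that one <code>met_em</code> file per domain and per forcing time was produced (metgrid writes all times for every domain by default); a minimal sketch:
<pre style="shell">
$ ls met_em.d01.*.nc | wc -l    # number of forcing times for domain 1
$ ls met_em.d02.*.nc | wc -l    # and for domain 2
</pre>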

=== real ===
Vertical interpolation of the atmospheric forcing at the domain of simulation.

* Creation of the folder (from <code>$WORKDIR</code>)
<pre style="shell">
$ mkdir run
$ cd run
</pre>

* Link all the necessary files from WRF (this already links all that is necessary to run the model)
<pre style="shell">
$ ln -s $WRFversion/WRF/run/* ./
</pre>

* Remove and copy the configuration file (<code>namelist.input</code>)
<pre style="shell">
$ rm namelist.input
$ cp $WRFversion/WRF/run/namelist.input ./
</pre>

* Edit the file and prepare the configuration for the run (re-adapt domain, physics, dates, output, ...). See an example for the two nested domains here: [[WRF/namelist.input]]; a sketch of the most relevant sections is given right after this block
<pre style="shell">
$ vim namelist.input
</pre>
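
A hedged sketch of the sections that most often need editing (the values are illustrative and must be consistent with the <code>namelist.wps</code> and the <code>met_em</code> files; see [[WRF/namelist.input]] for the full example):
<pre style="shell">
&time_control
 start_year  = 2012, 2012,
 start_month =   12,   12,
 start_day   =   01,   01,
 end_year    = 2012, 2012,
 end_month   =   12,   12,
 end_day     =   10,   10,
 interval_seconds = 21600,     ! must match the forcing frequency
 history_interval = 180, 60,   ! illustrative output frequency (minutes)
/

&domains
 time_step = 150,              ! illustrative, of the order of 5-6 x dx (km) in seconds
 max_dom   = 2,
 e_we      = 200, 106,         ! must match the geogrid domains
 e_sn      = 240, 106,
 dx        = 30000, 4285.714,
 dy        = 30000, 4285.714,
 parent_grid_ratio      = 1, 7,
 parent_time_step_ratio = 1, 7,
/
</pre>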

<B>NOTE:</B> If you are using the LCZ classification scheme, the following changes must be done in the <code>namelist.input</code> to properly simulate the domains with the data.
<pre style="shell">
&physics
use_wudapt_lcz = 1
num_land_cat = 61 #If you chose to run geogrid directly from WPS v4.5
num_land_cat = 41 #If you chose to use w2w
</pre>
When using LCZs, it is recommended to use the urban parameterization options BEP or BEP+BEM (<code>sf_urban_physics = 2 or 3</code>, respectively). In case you use the SLUCM model (<code>sf_urban_physics = 1</code>), the lowest model level has to be above the highest building height.

* Linking the <code>metgrid</code>-generated files
<pre style="shell">
$ ln -s ../metgrid/met_em.d*.nc ./
</pre>

* Getting the PBS job script for <code>real.exe</code>
<pre>$ cp /share/WRF/run_real.pbs ./</pre>

* Getting a bash script to run the executables in hydra (there is an issue with <code>ulimit</code> which does not allow running on more than one node)
<pre style="shell">
$ cp /share/WRF/launch_real_intel.pbs ./
$ qsub /share/WRF/launch_pbs.bash
</pre>

* If everything went fine one should have (the basic ones):
<pre>
wrfbdy_d[nn] wrfinput_d[nn] ...
</pre>

** <code>wrfbdy_d01</code>: boundary-conditions file (only for the first domain)
** <code>wrfinput_d[nn]</code>: initial-conditions file for each domain
** <code>wrffdda_d[nn]</code>: nudging file [optional]
** <code>wrflowinp_d[nn]</code>: file with updated (every time-step of the atmospheric forcing) surface characteristics [optional]

** And...
<pre style="shell">$ tail outreal/rsl.error.0000
(...)
real_em: SUCCESS COMPLETE REAL_EM INIT
</pre>

== WRF ==
* Simulation (look at the [http://www2.mmm.ucar.edu/wrf/users/docs/user_guide_V3/users_guide_chap5.htm#_Description_of_Namelist description of namelist] for the namelist configuration/specifications).

* Getting the necessary PBS job (same folder as for <code>real</code>)
<pre style="shell">$ cp /share/WRF/launch_wrf_intel.pbs ./
$ qsub launch_wrf_intel.pbs
</pre>
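
Once submitted, the job can be followed with the standard PBS tools and by watching the log files; a minimal sketch (the log path depends on where the job writes it, e.g. <code>outwrf/</code>):
<pre style="shell">
$ qstat -u $USER          # state of the user's jobs in the queue
$ tail -f rsl.error.0000  # follow the integration while it runs
</pre>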

* If everything went fine one should have (the basic ones):
<pre>outwrf/wrfout_d[nn]_[YYYY]-[MM]-[DD]_[HH]:[MI]:[SS] outwrf/wrfrst_d[nn]_[YYYY]-[MM]-[DD]_[HH]:[MI]:[SS] ...
</pre>

** <code>wrfout/wrfout_d[nn]_[YYYY]-[MM]-[DD]_[HH]:[MI]:[SS]</code>: simulation output (at &eta;=(p-p<sub>top</sub>)/(p<sub>sfc</sub>-p<sub>top</sub>) levels)
** <code>wrfout/wrfrst_d[nn]_[YYYY]-[MM]-[DD]_[HH]:[MI]:[SS]</code>: restart file (to continue the simulation)
** <code>wrfout/wrfxtrm_d[nn]_[YYYY]-[MM]-[DD]_[HH]:[MI]:[SS]</code>: file with extremes from the internal integration [optional]
** <code>wrfout/wrfpress_d[nn]_[YYYY]-[MM]-[DD]_[HH]:[MI]:[SS]</code>: file at vertical pressure levels [optional]
** <code>namelist.output_[YYYY]-[MM]-[DD]_[HH]:[MI]:[SS]</code>: all the parameters used for the simulation
** <code>stations</code>: folder with the time-series files (<code>tslist</code>)

* And...
<pre style="shell">
$ tail outwrf/rsl.error.0000
(...)
d01 [YYYY]-[MM]-[DD]_[HH]:[MI]:[SS] wrf: SUCCESS COMPLETE WRF
</pre>
   
* While ''running'' one can check the status by looking at the <code>rsl.error.0000</code> file, e.g.:
<pre>$ tail rsl.error.0000
(...)
Timing for main: time [YYYY]-[MM]-[DD]_[HH]:[MI]:[SS] on domain 1: 3.86105 elapsed seconds
</pre>

= WRF: known errors =

== CFL ==
* But if something went ''wrong'':
** [https://en.wikipedia.org/wiki/Courant%E2%80%93Friedrichs%E2%80%93Lewy_condition CFL]: at <code>wrfout/[InitialDATE]-[EndDATE]/</code> there are the files <code>rsl.[error/out].[nnnn]</code> (two per CPU). A <code>SIGSEGV segmentation fault</code> usually appears, and it can come from different sources, usually <code>cfl</code> violations. Look for the largest <code>rsl</code> file (after the <code>0000</code> ones), e.g. with <code>$ ls -rS rsl.error.*</code>:
 
<pre>$ wrfout/20121201000000-20121210000000/rsl.error.0009
(...)
d01 2012-12-01_01:30:00  33 points exceeded cfl=2 in domain d01 at time 2012-12-01_01:30:00 hours
d01 2012-12-01_01:30:00  MAX AT i,j,k:    100    94    21 vert_cfl,w,d(eta)=    5.897325    17.00738    3.2057911E-02
(...)</pre>
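
When a run stops with these <code>cfl</code> messages, the usual first measures are to reduce the time step and/or to activate vertical-velocity damping in <code>namelist.input</code>; a hedged sketch (standard WRF options, illustrative values):
<pre style="shell">
&domains
 time_step = 120,       ! reduced with respect to the original value
/

&dynamics
 w_damping = 1,         ! damp excessive vertical velocities
 epssm     = 0.2, 0.2,  ! larger off-centring of the vertical sound waves
/
</pre>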

== Restart not working ==
Since a given version, a new namelist parameter has to be included (in the <code>&time_control</code> section, see [http://www2.mmm.ucar.edu/wrf/users/docs/user_guide_V3/users_guide_chap5.htm#restart WRF restart]) in order to make the option of continuing a simulation from a given restart available:
<PRE>
override_restart_timers = .true.
</PRE>

If we want to get values in the output files at the time of the restart, you also need to add (in the <code>&time_control</code> section) the parameter:
<PRE>
write_hist_at_0h_rst = .true.
</PRE>
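
For completeness, a hedged sketch of the other <code>&time_control</code> entries involved when running with restarts (illustrative values):
<pre style="shell">
&time_control
 restart          = .true.,   ! continue from wrfrst_* files instead of a cold start
 restart_interval = 1440,     ! write a restart file every 1440 min (1 day)
 override_restart_timers = .true.,
 write_hist_at_0h_rst    = .true.,
/
</pre>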
   
 
= Additional information =
For more information and further details visit [[WRFextras]]

= WRF4L: Llu&iacute;s' WRF work-flow management =
For information about Llu&iacute;s' WRF work-flow management visit [[WRF4L]]

= WRF URBAN =
For information about urban simulations using WRF 4.5.0 or higher visit [[WRF-URBAN]]

= CDXWRF: WRF for CORDEX =
For a new module developed in CIMA to attain the CORDEX variable demands visit [[CDXWRF]]

= WRFles =
For LES simulations with the WRF model visit [[WRFles]]
