WRF-Fire Online Tutorial

Overview

WRF-Fire is a physics module within the Advanced Research version of the Weather Research and Forecasting (WRF) model that allows users to simulate the growth of a wildland fire while accounting for terrain slope, fuel characteristics, atmospheric conditions, and the dynamic feedback between the fire and the atmosphere. It is implemented as a physics package with two-way coupling between the fire behavior and the atmospheric environment, so the latent and sensible heat released by the fire alters the surrounding atmosphere and the fire effectively creates its own weather.

Tutorials

If you're interested in learning more, we hold virtual tutorials for WRF-Fire. To learn about upcoming tutorials, check out the WRF-Fire events on our Event Page.

This tutorial covers the basic steps required to run a WRF-Fire simulation from scratch. These instructions are written for a user on the Cheyenne Supercomputer that is part of the NCAR-Wyoming Supercomputing Center (NWSC). Please email Tim Juliano (tjuliano@ucar.edu) with any questions, comments, or feedback. Happy modeling!

GitHub Source Code

Step 1: Downloading, configuring, and compiling WPS and WRF

1a) Downloading WPS and WRF

Let’s begin by downloading the official WPS and WRF code, which have been uploaded into the Github repository for our LEAP-HI project. Navigate to your working directory and type the following in the command line:

git clone https://github.com/NCAR/WRF-Fire-LEAPHI.git

This will take a couple of minutes. Once the files have been downloaded, you should see a newly created sub-directory called WRF-Fire-LEAPHI. Within this directory, there should be two sub-directories called WPS and WRF.

1b) Configuring and compiling WRF

Next, navigate into your WRF directory. We will start by creating a new branch from the master branch:

git checkout -b c34_test_case origin/master

Here, I am naming my branch c34_test_case, but feel free to name your branch whatever you like.

Now we can configure WRF by typing the following in the command line:

./configure

I personally use option 34 (GNU, gfortran/gcc, dmpar) with the following modules loaded:

1) ncarenv/1.3   2) gnu/8.3.0   3) ncarcompilers/0.5.0   4) netcdf/4.7.3   5) mpt/2.19
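
If you need to load these modules yourself, the following commands should do it on Cheyenne (a minimal sketch; confirm availability with module avail, since installed versions may differ):

module purge
module load ncarenv/1.3 gnu/8.3.0 ncarcompilers/0.5.0 netcdf/4.7.3 mpt/2.19
module list    # verify that the five modules listed above are loaded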

Choose option 1 (basic) when asked about "Compile for nesting".

Once successfully configured, we need to compile the WRF em_real directory:

./compile em_real

This command will print the output directly to your screen. You can also redirect the compilation output to a file, e.g.:

./compile em_real >& compile_em_real.log &

This will take anywhere from about 10-60 minutes depending on your configuration option. Once successfully compiled, we want to compile the WRF em_fire directory:

./compile em_fire

This part will take only a few minutes. Once successfully compiled, navigate into the test/em_real and test/em_fire directories to ensure that files and executables have been soft linked from the main and run directories. Specifically, you should see the following executables in em_real: ndown.exe, real.exe, tc.exe, and wrf.exe; and in em_fire: ideal.exe and wrf.exe. Once you see these, we are ready to configure and compile the pre-processing system.

1c) Configuring and compiling WPS

Now that we have successfully compiled WRF, we want to navigate into our WPS directory. Similar to how we configured and compiled WRF, we will do the same for WPS; type:

./configure

in the WPS directory. For this, I use option 3 (Linux x86_64, gfortran, dmpar), but again, this will depend upon your chosen modules. Once configured, type:

./compile

Compiling WPS will take a couple of minutes, and if successful, you will see that three important executables have been created: geogrid.exe, metgrid.exe, and ungrib.exe. Once you have those files, you have completed installing WRF and WPS!

Step 2: Running the pre-processor (WPS)

The purpose of this step is to (i) set up our model domain and extract terrain and other static data (using program geogrid.exe), (ii) extract meteorological information from gridded data (ungrib.exe), and (iii) interpolate the meteorological data to our domain (metgrid.exe).

A note on running WPS executables: the geogrid.exe and metgrid.exe programs should be submitted as batch jobs on the Cheyenne supercomputer. I have included example job scripts in the WPS directory: runcheyenne_geogrid.sh and runcheyenne_metgrid.sh for geogrid.exe and metgrid.exe, respectively. Be sure to set the following line to reflect the Cheyenne <project_code> to which you would like to charge core hours:

#PBS -A <project_code>

More details on submitting jobs on Cheyenne may be found here: https://www2.cisl.ucar.edu/resources/computational-systems/cheyenne/run…

To check the status of your job, you may type the following in the terminal:

qstat -u <username>

where <username> is your Cheyenne login name.
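
For reference, here is a minimal sketch of what a Cheyenne PBS job script such as runcheyenne_geogrid.sh might contain. The job name, queue, walltime, and node request below are illustrative assumptions, so treat the scripts provided in the repository as authoritative:

#!/bin/bash
#PBS -N geogrid                          # job name (assumption)
#PBS -A <project_code>                   # project to charge core hours to
#PBS -q regular                          # Cheyenne queue (assumption)
#PBS -l walltime=00:20:00                # requested wall clock time (assumption)
#PBS -l select=1:ncpus=36:mpiprocs=36    # one 36-core Cheyenne node (assumption)
#PBS -j oe                               # merge stdout and stderr

# load the same modules used to compile WPS and WRF
module load ncarenv/1.3 gnu/8.3.0 ncarcompilers/0.5.0 netcdf/4.7.3 mpt/2.19

# launch geogrid with the MPT MPI launcher
mpiexec_mpt ./geogrid.exe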

2a) Setting up the model domain using namelist.wps

Let’s begin by setting up our model domain. We will use the namelist.wps file to tell WPS exactly what we want to do. This is simply a text file that we can edit with any text editor. For our case study of the C34 fire, I’ve created a namelist file called namelist.wps.c34; copy that file to namelist.wps:

cp namelist.wps.c34 namelist.wps

In the namelist.wps file, we specify the number of domains (we will use 2) and the start and end dates for the run. Our simulation will begin at 2100 UTC on February 13, 2019 and end at 0300 UTC on February 14, 2019, for a 6-hour run. Also specified in namelist.wps is the interval (in seconds) between the meteorological grids that drive the run; it is set to 3600, meaning grids are available at 1-hour increments. For our simulation, the inner domain will host the fire grid, so we must specify the subgrid_ratio_x and subgrid_ratio_y variables, which tell WPS how we want to subdivide our meteorological grid. We will use a value of 4, which means the fire grid will have a 4x horizontal refinement (e.g., if the horizontal grid cell spacing of the meteorological grid is 100 m, then that of the fire grid will be 25 m). Domain information is also set in namelist.wps, such as the number of grid points in the x- and y-directions, where the nest begins, the grid spacing of the outer grid, and the center latitude and longitude. The specifications for our run are already set.
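
To make this concrete, the relevant namelist.wps entries should look roughly like the sketch below. The dates, max_dom, interval_seconds, and subgrid ratios follow from the description above; the nest placement, grid dimensions, grid spacing, and center point are placeholders only, so keep the values already set in namelist.wps.c34:

&share
 max_dom          = 2,
 start_date       = '2019-02-13_21:00:00', '2019-02-13_21:00:00',
 end_date         = '2019-02-14_03:00:00', '2019-02-14_03:00:00',
 interval_seconds = 3600,
/

&geogrid
 parent_id         = 1,    1,
 parent_grid_ratio = 1,    3,      ! placeholder nest refinement
 i_parent_start    = 1,    50,     ! placeholder nest location
 j_parent_start    = 1,    50,
 e_we              = 100,  151,    ! placeholder grid dimensions
 e_sn              = 100,  151,
 dx                = 1000,         ! placeholder outer-grid spacing (m)
 dy                = 1000,
 ref_lat           = 37.4,         ! placeholder center latitude
 ref_lon           = -103.5,       ! placeholder center longitude
 subgrid_ratio_x   = 4,    4,      ! 4x fire-grid refinement
 subgrid_ratio_y   = 4,    4,
/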

2b) Running geogrid.exe

To run geogrid.exe, geographical data sets are required. Although you can download the complete data set, it is far better to link to an existing data set since the terrain files are quite large. The topographic files used are in the following directory on Cheyenne:

/glade/work/tjuliano/fire/unr/WPS_GEOG/

and all we need to do is to make certain we link to those files. The sample namelist.wps file that I’ve created should already have the correct path.
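
In practice, this link is just the geog_data_path entry in the &geogrid section of namelist.wps, e.g.:

&geogrid
 geog_data_path = '/glade/work/tjuliano/fire/unr/WPS_GEOG/',
/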

We also need to help geogrid understand how to process the static data for our domain. This is communicated through the GEOGRID.TBL file, which is located in the WPS/geogrid directory. Navigate into the geogrid directory and create a soft link to the GEOGRID.TBL.ARW.FIRE.CO file since we will be simulating a fire in Colorado (CO):

ln -sf GEOGRID.TBL.ARW.FIRE.CO GEOGRID.TBL

Now it is our task to define an area for our simulation. In our case, we will choose to run with an outer grid and one nest. The area we will use is centered on southeastern Colorado. This setup has been incorporated into the namelist.wps file, but feel free to modify the geographical region if you would like to experiment. To run geogrid.exe simply submit the runcheyenne_geogrid.sh batch script:

qsub runcheyenne_geogrid.sh

Information will print to the geogrid.log* files and, if successful, you will see a banner confirming “Successful completion of geogrid”. You will see two files created: geo_em.d01.nc and geo_em.d02.nc, representing the two domains for your simulation.

2c) Running ungrib.exe

The second program we need to run in the WPS directory is ungrib.exe. This is the start of the process by which we take grids from another model and incorporate them into the WRF environment for the upcoming simulation. The ungrib.exe program unpacks the gridded model information we use as input to WRF.

For our C34 case study, we will force WRF with HRRR grids. I’ve downloaded the necessary grids from the following website, which you may use for other simulations:

http://home.chpc.utah.edu/~u0553130/Brian_Blaylock/cgi-bin/hrrr_downloa…

In order to run a WRF simulation with HRRR grids, you will need to download the following files for each time of interest:

PRS (Pressure, 3D fields)

As noted above, however, model output grids have already been obtained for this case study. The grids may be found in the following directory on Cheyenne:

/glade/work/tjuliano/fire/unr/data

You will notice that there are files ending with wrfprsf00.grib2. These files, which contain the pressure-level (3D) fields, are the ones we will extract.

We will start by extracting the files using the ungrib.exe program, but we first need to create soft links. To create soft links in WPS, we use a special program called link_grib.csh. To link the HRRR GRIB2 grids, type the following:

./link_grib.csh /glade/work/tjuliano/fire/unr/data/*.grib2

If done successfully, you should see 7 different GRIBFILE files with extensions from AAA to AAG. Each one of these is linked to a different HRRR file.

Before running ungrib.exe, we need to inform WPS about the type of model output files we are using to initialize WRF. This information is passed by way of a file known as a Vtable, which tells ungrib which variables are found in the GRIBFILE files. Vtables for many different model formats are provided in your WPS directory under the subdirectory ungrib/Variable_Tables; if you navigate into this directory, you can see the available options. For the HRRR files, we will use the Vtable called Vtable.HRRR. You will need to create a soft link to this Vtable in the WPS directory. In your WPS directory, simply type:

ln -sf ungrib/Variable_Tables/Vtable.HRRR Vtable

Once you have this done, you can run the ungrib.exe program via the terminal (no need to submit a batch script for this program):

./ungrib.exe

where again you may redirect the output to a log file if you wish. If ungrib.exe has completed processing the HRRR files (which should take almost 10 minutes), then you should see a line at the end informing you of the "Successful completion of ungrib". Also, you should see 7 different files in your WPS directory starting with FILE:2019-02*.

2d) Running metgrid.exe

Congratulations! You’ve made it to the final step of WPS. This last step will take your geo_em.d01.nc and geo_em.d02.nc files, as well as your FILE:* files, and create met_em* files that WRF will use to generate initial and boundary condition forcing files.

Fortunately, this step should be relatively painless. In order to generate our met_em* files, simply submit the runcheyenne_metgrid.sh batch script:

qsub runcheyenne_metgrid.sh

Information will print to the metgrid.log* files, and after processing the final domain (domain 2 in our case), text should announce "Successful completion of metgrid", at which point you have completed all of the required pre-processing tasks.

Step 3: Running WRF

A note on running WRF executables: much like with WPS, the real.exe and wrf.exe programs should be submitted as batch jobs on the Cheyenne supercomputer. I have included example job scripts in the test/em_real directory: runcheyenne_real.sh and runcheyenne_wrf.sh for real.exe and wrf.exe, respectively. Again, be sure to set the following line to reflect the Cheyenne <project_code> to which you would like to charge core hours:

#PBS -A <project_code>

3a) Configuring the simulation using namelist.input

We have a few steps before we run our C34 WRF-Fire case. First, we need to navigate into the WRF/test/em_real directory. The key file here is namelist.input. Similar to WPS, this is the file that you will use to tell WRF about your model configuration. For our case study of the C34 fire, I’ve created a namelist file called namelist.input.c34; copy that file to namelist.input:

cp namelist.input.c34 namelist.input

Often it is easier to have both your namelist.wps and namelist.input files open since it is necessary to have consistent information in each. For example, make certain the grid sizes, times, etc. match!
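 
As a concrete example, the &time_control section of namelist.input must mirror the dates and data interval chosen in namelist.wps. A rough sketch (the dates, run length, and interval_seconds follow from Step 2a; the history output interval shown here is only an illustrative assumption):

&time_control
 run_hours        = 6,
 start_year       = 2019, 2019,
 start_month      = 02,   02,
 start_day        = 13,   13,
 start_hour       = 21,   21,
 end_year         = 2019, 2019,
 end_month        = 02,   02,
 end_day          = 14,   14,
 end_hour         = 03,   03,
 interval_seconds = 3600,
 history_interval = 10,   10,      ! history output frequency in minutes (assumption)
/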

Before we can run WRF, we need to first link to the met_em* grids that we created in the WPS directory. The link is simple; in your test/em_real directory, just type:

ln -sf ../../../WPS/met_em* .

and you should see the soft linked files in your directory. Again, for both the outer (d01) and inner (d02) domains, there should be 7 different met_em* files (one for each hour of our simulation).

3b) Running real.exe

The next step is to run the program real.exe. This program vertically interpolates the meteorological fields from the met_em* files to the WRF vertical levels, and it is the final step before running WRF. To run real.exe, type:

qsub runcheyenne_real.sh

Information will print out to the rsl.out.* files. After successful completion, you should see three files: wrfinput_d01, wrfinput_d02 and wrfbdy_d01.
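
A quick way to confirm that real.exe finished cleanly is to check the end of the first rsl file and make sure the three files exist:

tail -n 3 rsl.out.0000                      # should report successful completion of real.exe
ls -l wrfinput_d01 wrfinput_d02 wrfbdy_d01  # initial and boundary condition files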

3c) Running wrf.exe

The example simulation that has been prepared will initiate a fire from a point ignition. This is controlled via options in the &fire section of the namelist.input file. Beginning a fire from a perimeter is also possible; however, this process is a bit more involved, as it requires downloading and processing the GIS perimeter files and modifying the wrfinput_d02 file for WRF. Information on how to initiate a fire from a perimeter is coming in the future, so please stay tuned!
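
To give a sense of what a point-ignition setup looks like, here is a rough sketch of a &fire section for the inner domain. The variable names are standard WRF-Fire options, but the ignition location, radius, and timing values below are placeholders rather than the actual C34 settings in namelist.input.c34:

&fire
 ifire                     = 0,  2,       ! fire module active on the inner domain only
 fire_num_ignitions        = 1,           ! single point ignition
 fire_ignition_start_lon1  = -103.50,     ! placeholder ignition longitude
 fire_ignition_start_lat1  =   37.40,     ! placeholder ignition latitude
 fire_ignition_end_lon1    = -103.50,     ! equal to the start point for a point ignition
 fire_ignition_end_lat1    =   37.40,
 fire_ignition_radius1     = 50.,         ! placeholder ignition radius (m)
 fire_ignition_start_time1 = 60.,         ! placeholder ignition start time (s after model start)
 fire_ignition_end_time1   = 120.,        ! placeholder ignition end time (s after model start)
/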

Now you are ready to run a WRF-Fire simulation! Simply submit a batch job for wrf.exe:

qsub runcheyenne_wrf.sh

The simulation should take about 30 minutes of wall clock time before the fire hits the domain boundary.
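
While the job runs, you can monitor progress in the rsl.out.* files, and (assuming the default WRF history file naming) model output will appear as wrfout files for each domain:

tail -f rsl.out.0000              # watch the simulation advance time step by time step
ls -lh wrfout_d01* wrfout_d02*    # history files for the outer and inner domains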

Contact

Please direct questions/comments about this page to:

Timothy Juliano