The JULES Community
The JULES model is available free of charge to any researcher for non-commercial use, and this has led to a large and diverse community from all over the globe. Although the majority of researchers using JULES are based in the U.K., requests for the JULES code have come from places as diverse as China, India, New Zealand and South America.
Here's how to get started as a JULES user:
- The first step is to request access to the code here. Registration is free and open to all for non-commercial use (registration is required just to keep track of who is using JULES, not to restrict usage). When you register, you will automatically be set up with a Met Office Science Repository Service (MOSRS) login (may take a few days), which will allow you to access the JULES TRAC.
- Next, bookmark the main JULES manual pages and the online manual for the version of JULES you'll be using (I have the namelist section bookmarked - for JULESvn4.8 this is here - because I continually use the search box there to look up what particular model variables mean).
- Now do at least one of the tutorials on our training page, for example Toby's JULES From Scratch.
- Next, subscribe to the community mailing lists (this might happen automatically when you register for MOSRS):
  - JULES general interest: jules*A*T*lists.reading.ac.uk (subscribe here)
  - JULES users: jules-users*A*T*lists.reading.ac.uk (subscribe here)
- For all technical issues with JULES, Rose, Cylc or FCM, email JULES support: jules-support*A*T*metoffice.gov.uk. However, please check the documentation and tutorials on the JULES TRAC and JULES manual pages before emailing a support request.
- Finally, there are also the Met Office JULES Announce and JULES General Discussions Yammer groups (email Scientific_Partnerships@metoffice.gov.uk for an invitation).
In common with all land surface models, JULES needs three elements to perform a simulation: driving data, ancillary/prescribed data and control files (together these are called the 'model configuration'). Here is a list of standard configurations for JULES.
Users are encouraged as far as possible to work from one of the CORE CONFIGURATIONS, although there are also non-core configurations in use for research and other purposes. The idea is that, as much as possible, you DON'T assemble your own configuration from scratch. Instead, choose one of the pre-assembled configurations that is similar to what you want to do, download it (this is free) and modify its options and settings for use in your particular application on your particular system. Ideally, if your configuration gives good results and you've published it, please contact us about it (see email address below) and we can add your configuration to the list for others to use.
Here are some links to help you select the most appropriate configuration components for your project:
DRIVING DATA: These are meteorological or climate data, which can either be obtained from standard open-access datasets (see table below; usually in NetCDF format) or from a met. station (as in e.g. the Loobos example or studies like Marthews et al. 2012). Basically, JULES requires data on radiation (downward SW and LW), precipitation (rainfall and snowfall), temperature, surface pressure, specific humidity and wind speed (more specifically, the variables listed here). These variables must be available for every timestep, although JULES can do some interpolation/disaggregation automatically if the simulation timestep differs from the driving data timestep (e.g. 3-hourly data to half-hourly, or daily data to hourly). JULES-appropriate sources for driving data include:
| Data source | Extent | Resolution | Time Range | Comments |
|---|---|---|---|---|
| CHESS | U.K. (excl. Shetland & N. Ireland) | 1 km | 1961-2015 | 2016 article |
| WATCH Forcing Data ERA-Interim (WFDEI) | Global | 0.5° | | see notes below |
| CRU-NCEP v4 | Global | 0.5° | 1901-2012 | Viovy & Ciais (2011) |
| Princeton | Global | 1.0° / 0.5° / 0.25° | 1948-2010 | Sheffield et al. (2006) |
| GSWP3 | Global | 0.5° | 1850-2010 | Website under construction: can't yet download data |
| CHELSA | Global | 30 arc-sec | 1979-2013 | Based on ERA Interim |

Notes on WFDEI: this includes Rainf_WFDEI_CRU and Snowf_WFDEI_CRU. However, Rainf_WFDEI_GPCC and Snowf_WFDEI_GPCC currently still extend only to the end of 2013 (awaiting updates of the GPCC full data product). In general it is recommended that these 3-hourly driving data are interpolated (by JULES) to the model timestep as follows:
(1) Interpolate using "i" for Tair, PSurf, Qair and Wind; "b" for SWdown and LWdown; and "nb" for Rainf and Snowf (see interpolation flags).
(2) Use a simulation timestep that divides the driving data timestep into an even number of parts (e.g. a 30-min timestep divides 3-hourly data into 6 parts), which ensures that incoming radiative energy is not lost or created during "b" interpolation.
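As a quick sanity check on these interpolation recommendations, here is a minimal Python sketch. The driving-variable names and the variable-to-flag mapping follow common JULES example setups and should be treated as assumptions; check the interpolation-flag documentation for your JULES version.

```python
# Sketch only (not official JULES code). Recommended interpolation flags for
# 3-hourly driving data, plus a check that the model timestep divides the
# driving-data period into an even number of parts, so that "b"
# back-interpolation does not lose or create incoming radiative energy.
# Variable names below are assumptions based on common JULES examples.

RECOMMENDED_FLAGS = {
    "t": "i", "pstar": "i", "q": "i", "wind": "i",   # instantaneous values
    "sw_down": "b", "lw_down": "b",                  # backward time averages
    "tot_rain": "nb", "tot_snow": "nb",              # no interpolation, backward
}

def timestep_ok(data_period_s: int, timestep_s: int) -> bool:
    """True if the model timestep divides the driving period into an even number of parts."""
    if data_period_s % timestep_s != 0:
        return False
    return (data_period_s // timestep_s) % 2 == 0

# 3-hourly data (10800 s) with a 30-min (1800 s) model timestep: 6 parts, OK
print(timestep_ok(10800, 1800))   # True
print(timestep_ok(10800, 3600))   # 3 parts (odd) -> False
```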
(n.b. there are many other data products and alternative sources, e.g. see the links here). If you have not been provided with driving data for your project, by far the best initial source is to consult the (uncoupled versions of) standard JULES configurations and see what they use.
ANCILLARY/PRESCRIBED DATA: These are spatial datasets, e.g. soil properties (hydraulic and thermal parameter values), land cover (frac; vegetation and non-vegetation) and DEM-derived gridded flow directions and accumulated areas. If these have been provided as time-varying ancillaries then they are called prescribed data by JULES (other ancillary data are constant). For non-point runs they are usually all in NetCDF format. JULES runs require these (non-time-varying) ancillary fields and these (time-varying) prescribed data fields, usually at the same resolution as your driving data:
| Driving data is for: | Source for land cover fraction (frac) | Leaf area index (LAI) | Vegetation parameters | Soil properties |
|---|---|---|---|---|
| A point run | Use site data | Use site data | Recommended values are in the standard configuration control files (see below) | If soil data are available, use pedotransfer functions (see e.g. Marthews et al. 2014) |
| A gridded run | See IGBP or USGS | See GLCF | See TRY | |
n.b. there are MANY other sources: these are just some starter suggestions (if anyone wants to contribute to this page by expanding this table then pls email me! - Toby, May 2017).
If you have not been provided with ancillary data for your project, by far the best initial source is to consult the standard JULES configurations and see what they use.
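One common ancillary pitfall worth checking early: JULES expects the land-cover fractions (frac) at each land point to sum to 1 across surface types. A minimal sketch of that check, assuming the array layout (surface types along the first axis) used here, which is an assumption and may differ from your ancillary file:

```python
# Hedged sketch: verify that land-cover fractions sum to 1 at every land point.
# The (ntype, npoints) layout is assumed for illustration; check your own file.
import numpy as np

def frac_ok(frac, tol=1e-4):
    """frac: array of shape (ntype, npoints); fractions should sum to 1 per point."""
    sums = frac.sum(axis=0)
    return bool(np.all(np.abs(sums - 1.0) <= tol))

# Toy example: 3 surface types at 2 land points
frac = np.array([[0.6, 0.2],
                 [0.3, 0.5],
                 [0.1, 0.3]])
print(frac_ok(frac))   # True
```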
The UK Met Office Central Ancillary Program (CAP): The Met Office Unified Model (UM) uses the CAP to create ancillary data files on appropriate grids for use with the UM (see User Guides here (requires password) or here, or [very old] docs #70 and #73 here). The related tool Xancil, written by Jeff Cole, creates UM ancillary files from NetCDF input files. As of 2015, CAP was also running on ARCHER. However, several critical weaknesses in the CAP system were recognised (described in this position paper by Keir Bovis in 2012), and the TIAN program was initiated to produce a replacement: the new ANTS system (due for public release in 2017, whereupon it will begin to replace the CAP; see documentation here, and I think a pre-release version can be downloaded by Met Office internal staff here).
There is no central repository for JULES ancillaries. In the absence of this, ancillaries for JULES runs are difficult to come by and usually assembled for each individual project from one or other of the sources above (including use of some converted UM ancillaries). With the Rosie Go system (see below) the situation has improved: it's now possible to download Rose suites from projects similar to your own, look inside to see what ancillary files have been used and then email the suite owner to seek permission to use those files. Before ANTS becomes widely available, this is probably the most direct way to acquire the ancillary and prescribed data files you may need.
CONTROL FILES: Currently JULES's control files must be in the form of either (i) a set of Fortran namelist files (model parameter input .nml files) or (ii) a Rose suite (a suite is effectively a bundle of control files, including the same namelists as used for (i)).
- ROSE SUITES: There are three ways to get hold of Rose suites:
- Through downloading a standard JULES configuration, or
- Through the Rose Suite Discovery Engine Rosie Go (part of your Rose installation; see here for background and here for how to do this)
- By searching the online repository for a particular Rose suite, e.g. searching for u-am539 here leads you to this page here (where you can see the changeset if you click on the number (not the cog symbol) in the Rev column).
- NAMELIST FILES: Sets of namelists are not usually distributed around the JULES community any more (except as part of Rose suites), but if you are working from a set of these supplied from elsewhere, then you can either (i) run JULES without Rose (see here) or (ii) use create_rose_app to convert them into a Rose suite and use that to run JULES with Rose (see here).
The namelists JULES requires for (i) are described on the JULES Manuals pages (select your version of JULES and find the section "The JULES namelist files"). This is the best source for details of both kinds of control files, since a Rose suite and a set of namelist files are composed of the same set of namelists.
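For a flavour of what a namelist file contains, here is an illustrative sketch of a time-control namelist. The group and item names follow JULESvn4.8-era documentation but should be treated as assumptions: always check the namelist reference for your own JULES version.

```fortran
! Illustrative sketch only -- item names may differ between JULES versions.
! A run is controlled by setting values inside named groups like this:
&jules_time
  main_run_start = '1997-01-01 00:00:00',   ! start of the main run
  main_run_end   = '1997-12-31 23:30:00',   ! end of the main run
  timestep_len   = 1800                     ! model timestep in seconds
/
```

In a Rose suite these same settings appear in the suite's app configuration rather than as a bare .nml file, but the underlying namelist items are identical.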
Gridded runs from Land Surface Models (LSMs) in general can output data in 2D or 1D NetCDF files and you usually need to use different software to visualise/analyse these two types of gridded output. However, please be aware that JULES will (almost) always give you 1D output from gridded runs, whether the driving data are 1D or 2D (only in a few specific cases will the output be 2D: details are in the JULES Manual under section 6.13: model_grid.nml). This means that you either need to use analysis software that can deal with 1D files or you need to anticipate a post-processing step to convert your 1D JULES outputs into 2D files (for example, Alberto Martínez post-processed 1D output to generate the 2D results of the EU earth2observe project downloadable from the WCI Portal).
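The 1D-to-2D post-processing step described above usually amounts to scattering the 1D "land points" vector back onto the 2D latitude-longitude grid using the row/column index of each land point. A minimal sketch (the indices here are made up for illustration; in practice they come from your land mask or grid file):

```python
# Hedged sketch of converting JULES 1D land-axis output to a 2D grid.
# Sea/missing points are filled with NaN.
import numpy as np

def land_to_grid(land_vals, rows, cols, nlat, nlon, fill=np.nan):
    """Scatter a 1D land-points vector onto a 2D (nlat, nlon) grid."""
    grid = np.full((nlat, nlon), fill)
    grid[rows, cols] = land_vals
    return grid

# Toy example: 3 land points on a 2x3 grid
vals = np.array([1.5, 2.5, 3.5])
rows = np.array([0, 1, 1])   # latitude index of each land point
cols = np.array([2, 0, 1])   # longitude index of each land point
print(land_to_grid(vals, rows, cols, 2, 3))
```

The same index arrays can be reused for every variable and every timestep of a run, so in practice this loop is wrapped over all output files.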
Data visualisation tools include:
- For 2D NetCDF output: On UNIX/Linux people seem to use GrADS (see also here and here), Xconv, Iris (see also Cube Browser and Thea) or Ncview. On Windows you can use Panoply (very quick, but limited) or a GIS package like QGIS (see also here; has a lot of advantages if you want to compare JULES output to other raster or vector data, e.g. to calculate land surface fluxes over an irregular area like a river catchment).
- For 1D 'land axis' NetCDF output: On UNIX/Linux you can use GrADS: a 1D NetCDF file can be displayed in GrADS if you can generate a .pdef file that correctly describes the grid of your datafile (see here for how to construct a pdef file for gridded output). Another option on UNIX/Linux is xmgrace/grace.
(see presentation by Emma Robinson on Training and the data visualisation examples there).
During the process of analysing your data, a post-processing step is often required. This step is not done automatically by JULES: you need to tackle it yourself, usually with a bespoke script or program. Here are some shared example scripts from the JULES community:
| Post-processing option | Contributed by | Description | Source |
|---|---|---|---|
| Scripts supplied June 2017 from global WFDEI runs | Darren Slevin | See description below | |
| JULES tools | Toby Marthews | Miscellaneous *** unsupported *** tools | Link |
| CLM Post-Processing and Analysis Utilities | - | - | Link |

As DS put it: "The python script does the actual regridding. The bash script just calls the python script for multiple years/months (this can be changed for different timesteps). I have also included the WFDEI files which describe the location of the gridboxes on a 2d grid. This is needed to place the model output from the 1d array to the 2d grid. This code works for a 1-level global grid of GPP (which can be modified). ... I have provided code for 3 different spatial resolutions (half, 1 and 2 degree resolution)".
n.b. post-processing options are almost universally personally written scripts/programs, because the variety of grids and masks used for LSM simulations is so wide that a generalised post-processor is extremely difficult to code up. Therefore, please use these contributed code snippets AT YOUR OWN RISK: NONE of them should be expected to work 'out of the box' without code modifications.
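As an illustration of the kind of bespoke regridding these scripts perform, the change of spatial resolution mentioned by DS (half, 1 and 2 degree) can often be reduced to block-averaging a fine regular grid to a coarser one. This is a simplified sketch only: real scripts must also handle land masks, missing data and area weighting.

```python
# Hedged sketch: coarsen a regular 2D field by block-averaging, e.g.
# 0.5 deg -> 1.0 deg by averaging 2x2 blocks of cells. Illustration only.
import numpy as np

def coarsen_mean(field, factor):
    """Block-average a 2D array by an integer factor along both axes."""
    ny, nx = field.shape
    if ny % factor != 0 or nx % factor != 0:
        raise ValueError("grid dimensions not divisible by factor")
    return field.reshape(ny // factor, factor,
                         nx // factor, factor).mean(axis=(1, 3))

# Toy 4x4 "half-degree" field coarsened to 2x2 "one-degree" cells
fine = np.arange(16, dtype=float).reshape(4, 4)
print(coarsen_mean(fine, 2))
```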
For JASMIN users, be aware of the JASMIN Analysis Platform.
Finally, see here for evaluation / benchmarking.