Code

Development workflow

 

Code development

Code development is not covered on this website: please see the JULES TRAC for comprehensive resources (starting with the "Developing JULES" section there), but here are a few notes:

  • The best place to start is with Kerry Smout-Day's Developer tutorials here. All proposed changes to the JULES code must follow the Working Practices for JULES development (broadly summarised by the flow diagram shown here) and the JULES Upgrade Procedures.
  • In addition to online tutorials, there are Rose/Cylc and RoseUM training courses held at the Met Office in Exeter regularly throughout the year. Sign up here.
  • JULES does not have a 'known errors' page for any of its versions: all known errors are on the JULES TRAC under individual tickets. If you are aware of any defect in the model, or have ideas for an enhancement, then please help the community by raising a ticket about it (or, for larger issues, you might want to start up a PEG to discuss the issue).

 

Archive versions

For the current version of JULES, please download from the MOSRS using FCM (see Getting Started). For older versions of JULES, please contact Toby Marthews at CEH (contact details in the page footer). Tarballs of versions 1.0 to 4.2 of JULES have been kept for reference and are available on request; a user guide is included in the docs/ directory of each tarball, but these pre-2015 versions are no longer supported.

Release date | Version | Release notes
31-OCT-2014 | JULES vn4.1 (for vn4.1 and all later versions of JULES, please see the JULES TRAC) | Notes
1-JUL-2014 | JULES vn4.0 | Notes
9-OCT-2013 | JULES vn3.4.1 | Notes
21-AUG-2013 | JULES vn3.4 | Notes
10-APR-2013 | JULES vn3.3 | Notes
20-NOV-2012 | JULES vn3.2 | Notes
28-JUN-2012 | JULES vn3.1 | Notes
28-FEB-2011 | JULES vn3.0 (for a summary of developments up to this point, see here) | Notes
 | (the JULES logo changed at this point) | 
24-NOV-2010 | JULES vn2.2 | Notes
2-FEB-2010 | JULES vn2.1.2 | Notes
5-JAN-2010 | JULES vn2.1.1 | Notes
29-SEP-2009 | JULES vn2.1 | Notes
10-JUL-2007 | JULES vn2.0 | Notes
02-OCT-2006 | JULES vn1.0, released at the JULES Launch meeting (see overview here) | Notes
MAY-2005 | JULES vn0 frozen (essentially a combination of TRIFFID and MOSES v2.2) | 
AUG-2001 | MOSES v2.2 (Essery et al. 2001, HCTN_30.pdf; essentially v2.0 rewritten slightly to accommodate other changes in the Unified Model) | 
JAN-2001 | Top-down Representation of Interactive Foliage and Flora Including Dynamics (TRIFFID) model (Cox 2001, HCTN_24.pdf) | 
1999 | MOSES v2.0 (Essery et al. 2003, J Hydromet) | 
1997 | Met Office Surface Exchange Scheme (MOSES) v1.0 (Cox et al. 1999, Clim Dyn) | 

 

Release notes

Below are the release notes for each JULES version up to the move to the version-control system introduced at JULES vn4.1 (for vn4.1 and later, see the JULES TRAC):

 

JULES vn4.0

JULES-Crop crop model: JULES vn4.0 sees the introduction of the JULES-Crop crop model. This is the result of a successful collaboration between the University of Reading (Tom Osborne and Josh Hooker) and the Met Office (Jemma Gornall, Andy Wiltshire and Karina Williams).

Much of the work to get it ready for the trunk, and to test it, was done at the Met Office by Karina Williams and Jemma Gornall.

Daily disaggregator for forcing data: JULES can now be driven with daily forcing data, and the daily disaggregator will disaggregate the daily forcing down onto the model timestep. For more information, see l_daily_disagg.
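
As a minimal sketch (the enclosing namelist name and the companion item shown here are assumptions for illustration, not a definitive configuration), the switch would be set along these lines:

    &JULES_DRIVE
      ! Hypothetical sketch: drive the model with daily data and let the
      ! disaggregator produce values on the model timestep.
      l_daily_disagg = .TRUE.,
      data_period    = 86400    ! period of the driving data, in seconds
    /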

Major namelist changes: JULES vn4.0 also sees a major revamp of the science-related namelists. The monolithic JULES_SWITCHES namelist, and various others, are gone, replaced by one namelist per science section. For more details, see The JULES namelist files.

This was done with the aim of providing a GUI for editing the JULES namelists using Rose, which is now available - see Automatic upgrading and GUI using Rose.

It also has the advantage that the new namelists are cut-and-paste-able between the UM and JULES, which should make it easier to ensure that the same science is being used in online and offline runs.
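
As a minimal sketch of the new style (the namelist name is assumed here for illustration; nsmax is the multi-layer snow scheme's layer count mentioned under vn2.1 below), a science section now gets its own small, self-contained namelist:

    &JULES_SNOW
      ! Hypothetical example of a science-section namelist:
      ! run the multi-layer snow scheme with up to 3 layers.
      nsmax = 3
    /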

Removal of GNU make build files: After a period of supporting two build systems (FCM make and GNU make), it was decided that support for GNU make should be removed. The overhead of maintaining two build systems had become too large, and FCM make is preferred for several reasons (a configuration sketch follows this list):

Directory structure:

  • The directory-level dependencies used by the JULES Makefile to ensure files are compiled in the correct order forced the directory structure to adapt to the build system.
  • FCM make does automatic dependency analysis for each file to ensure they are compiled in the correct order, meaning the directory structure doesn’t have to be compromised to keep the build system happy.

Dependencies:

  • The JULES GNU Makefiles required that dependencies be manually maintained, both in terms of the order of sub-makes and actual file dependencies within the sub-makes.
  • FCM make automatically detects all dependencies and does things in the correct order.

Parallel builds:

  • JULES builds with GNU make could not be parallelised, because of the use of directory level sub-makes.
  • FCM make considers each individual file, so builds can be parallelised.

Integration with Rose:

  • FCM make has good integration with Rose, allowing the Rose GUI for JULES to configure and run builds as well as the namelists.
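
For illustration only, a minimal FCM make configuration for an offline build might look like the following (the paths and compiler properties here are assumptions; the JULES release ships its own configuration files):

    # Sketch of an fcm-make configuration (illustrative, not the shipped one)
    steps = build
    build.source = $HERE/src           # dependencies are analysed per file
    build.target = jules.exe
    build.prop{fc} = gfortran          # Fortran compiler
    build.prop{fc.flags} = -O2

Because dependencies are analysed per file, such a build can then be run in parallel (e.g. fcm make -j 4).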

Bugs and other changes:

  • Output for land points did not compare between land_only = T and F runs on a 2D grid
  • Incorrect behaviour when spinup_end == data_end
  • Fixed an overflow problem in datetime_diff when datetimes are too far apart
  • Removed the old implicit solver and ltimer code
  • Unified the management of printing and error reporting between the UM and standalone

 

JULES vn3.4

n.b. A critical memory leak was found in JULES v3.4 that necessitated a new release, designated v3.4.1.

Changes to semantics of output: The output semantics used since JULES vn3.2 (i.e. state variables captured at the start of a timestep, flux variables captured at the end) were confusing some users. The semi-implicit scheme in JULES is designed so that the state and fluxes at the end of a timestep are consistent with each other, but under the previous semantics these were staggered by one timestep in output files.

All variables are now captured at the end of a timestep, so state and flux variables at a particular timestep in output files will be consistent with each other. A new option has been added to request the output of the initial state, although very few users will have a use for it. It is still the case that the value in the time variable can be used to place snapshot data in time, and that the values in time_bounds give the interval over which a mean or accumulation applies.

More details can be found at JULES output.

Input and/or output of variables with multiple ‘levels’ dimensions has been improved: In JULES vn3.1 - vn3.3, variables could only be input or output with a single ‘levels’ dimension. In particular, this caused problems for variables in the new snow scheme, which have two ‘levels’ dimensions on top of the grid dimensions (tiles and snow levels). This led to compromises being made with the snow layer variables:

  • It was only possible to initialise the snow layer variables using a constant value, from a previous dump or using total_snow
  • In output files, the snow layer variables were represented using a separate variable for each tile

This problem is solved in JULES vn3.4 - it is now possible to input and output variables with multiple ‘levels’ dimensions (there is not even a restriction to two ‘levels’ dimensions). This means that both compromises for snow layer variables detailed above have been removed.

Streamlined process for adding new variables for input and/or output: Although fairly simple, the process for adding a new variable for input and/or output in JULES vn3.1 - vn3.3 required several edits to be made, and hence provided many opportunities to make mistakes. This process is simplified in JULES vn3.4 to require fewer edits. More details can be found at Implementing new variables for input and output.

Other changes:

  • Tidying of boundary layer code: Some small changes have been made to tidy up some of the boundary layer code (i.e. routines in src/science/surface) - this is mostly removing unused variables and tidying up subroutine argument lists.
  • OpenMP related changes: Some OpenMP directives have been added to certain loops. OpenMP is a form of shared-memory parallelism in which the user inserts directives (specially formatted code comments) providing information that allows the compiler to parallelise sections of code (in particular loops) without corrupting data. It is used in the UM, but is currently not enabled when compiling JULES standalone. A minimal sketch follows this list.
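
For illustration (this is not JULES code; the loop and variable names are invented), an OpenMP directive parallelising a loop over land points looks like this:

    PROGRAM openmp_sketch
      IMPLICIT NONE
      INTEGER, PARAMETER :: land_pts = 1000
      REAL :: smc(land_pts)
      INTEGER :: i
      smc = -0.5   ! dummy data
      ! The directive below is a formatted comment, so the code compiles
      ! unchanged when OpenMP is not enabled.
      !$OMP PARALLEL DO PRIVATE(i)
      DO i = 1, land_pts
        smc(i) = MAX(0.0, smc(i))   ! e.g. clamp soil moisture at zero
      END DO
      !$OMP END PARALLEL DO
      PRINT *, 'min smc = ', MINVAL(smc)
    END PROGRAM openmp_sketch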

Bugs fixed:

  • Output (including dump files) not correctly generated for the last spin-up cycle when spin-up fails and terminate_on_spinup_fail = TRUE
  • The lw_net diagnostic did not include the contribution from reflected incoming longwave when the emissivity is less than one

 

JULES vn3.3

Ability to run JULES in parallel: JULES can now run multiple points in parallel, using multiple cores on the same machine or a cluster of machines. This is accomplished using MPI (Message Passing Interface). Several implementations of MPI are available, the most commonly used being MPICH2 and Open MPI.

JULES takes advantage of the parallel I/O features in HDF5 / NetCDF4. These are not enabled by default, and so must be explicitly enabled when HDF5 / NetCDF4 are compiled. More information on how to do this can be found on the NetCDF website.

Information on how to build and run JULES in parallel can be found in the JULES User Guide. Note that although this development has proven stable during testing, it is still experimental and is considered to be for advanced users only.
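
As a sketch of the underlying idea (this is not JULES source code), each MPI task initialises the library, learns its rank, and works on its own subset of points:

    PROGRAM mpi_sketch
      USE mpi
      IMPLICIT NONE
      INTEGER :: ierr, rank, ntasks
      CALL MPI_Init(ierr)
      CALL MPI_Comm_rank(MPI_COMM_WORLD, rank, ierr)
      CALL MPI_Comm_size(MPI_COMM_WORLD, ntasks, ierr)
      ! In the model, each task would process its own share of the points
      ! and write its section of the output via parallel NetCDF.
      PRINT *, 'Task ', rank, ' of ', ntasks
      CALL MPI_Finalize(ierr)
    END PROGRAM mpi_sketch

Such an executable is then launched with the usual MPI machinery (e.g. mpirun -np 4 jules.exe); the exact command depends on the MPI implementation and is covered in the User Guide.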

Changes to documentation: From a user's point of view, the most important change is that the JULES documentation and coding standards are now provided in two forms - HTML (the preferred format) and PDF. The HTML documentation is also available on the web at http://jules-lsm.github.io/.

This has been made possible by migrating the documentation from a single massive Word document to the Sphinx documentation generator (with some custom extensions to better support Fortran namelists). Although originally intended to document Python projects, Sphinx’s extensibility has seen it adopted for a wide range of projects. Using Sphinx has several advantages over the previous monolithic Word document:

  • Both forms of documentation (HTML and PDF) can be built from the same sources.
  • The documentation is now split into several smaller files that are combined by Sphinx at build-time, leading to increased readability.
  • reStructuredText, the markup language used by Sphinx, is a plain text format, meaning that it can be version controlled much more effectively than a Word document (which is treated by Subversion as a single binary entity).
  • The only software required to update the documentation is your favourite text editor (rather than Word).

The JULES repository on PUMA has also been refactored so that configurations, documentation and examples sit in a separate project to the core Fortran code.

Other changes:

  • Disambiguation of sea ice roughness lengths for heat and momentum: Prior to vn3.3, these were implicitly assumed to be equal by the code. They can now be set separately in the namelist JULES_SURF_PARAM.
  • Improvements to the numerics in the soil hydrology: Previously, the soil hydrology scheme coped poorly with significant gradients in soil moisture because of the sensitive dependence of the hydraulic conductivity and soil water suction on the soil moisture. See the new switch l_dpsids_dsdz.
  • Implicit numerics for land ice: Previously, the updating of land ice temperatures was always explicit, limiting the thickness of soil levels that can be used with standard time steps. There is now an option for implicit numerics for land ice - see the new switch l_land_ice_imp.
  • Scaling of land surface albedo to agree with a given input: An option has been added to scale the grid-box mean snow-free albedo to agree with a given input (e.g. observations, climatology). See the new switch l_albedo_obs. For SW albedos, the albedos of the individual tiles are scaled linearly so that the grid-box mean albedo matches the observations, within limits for each tile. When VIS and NIR albedos are required, the input parameters are scaled and corrected in a similar manner. The change was included in the Global Land configuration at vn5.0: http://collab.metoffice.gov.uk/trac/GL/ticket/8.
  • BVOC emissions now on a switch: Previously, BVOC emissions diagnostics were calculated all the time, regardless of whether they were output. A new switch - l_bvoc_emis - has been added to enable the calculation of these diagnostics only when required.
  • Improvements to logging: A new namelist file - logging.nml - has been added to give more control over log output from JULES. Previously all output was directed to stdout.
  • Specify namelist directory as an argument: It is now possible to specify the directory containing the namelist files as a command-line argument to JULES. If no argument is given, JULES looks for the namelist files in the current working directory. Previously, JULES had to be executed in the directory containing the namelists; this change should make it easier to run JULES in batch mode. A sketch of the mechanism follows this list.
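
As an illustration of the mechanism (a sketch, not the actual JULES source), an optional directory argument can be read with standard Fortran intrinsics, falling back to the current directory:

    PROGRAM arg_sketch
      IMPLICIT NONE
      CHARACTER(LEN=256) :: nml_dir
      nml_dir = '.'   ! default: the current working directory
      IF ( COMMAND_ARGUMENT_COUNT() >= 1 ) THEN
        CALL GET_COMMAND_ARGUMENT(1, nml_dir)
      END IF
      PRINT *, 'Reading namelists from: ', TRIM(nml_dir)
    END PROGRAM arg_sketch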

Bugs fixed:

  • Initialisation of chr1p5m and resfs in sf_exch.
  • Fix for potential divide-by-zero in sf_stom when running with can_rad_mod = 1.
  • Various UM-related fixes not relevant to standalone JULES (ENDGAME, aerosol deposition scheme, etc.).

 

JULES vn3.2

JULES version 3.2 sees several enhancements and bug fixes in both the science and control code.

Standard Configurations: A set of standard science configurations has been defined. These are based on well-tested operational Met Office models, and are intended to cover a wide range of use cases.

Improvements to output: In JULES version 3.1, under some circumstances, it was not entirely clear how the timestamps in output files applied to the values. This has been thoroughly addressed in version 3.2.

Changes have also been made to the attributes of output variables:

  • The units attribute for output variables has been updated to be compliant with UDUNITS2.
  • A CF-convention coordinates attribute has been added to all output variables, explicitly linking the latitude and longitude to the data.

Biogenic Volatile Organic Compound (BVOC) emissions: Code written by Federica Pacifico for isoprene emissions has been implemented and extended to include monoterpene, acetone and methanol emissions. This addition is purely diagnostic in the standalone model (i.e. provides new output variables, but has no feedbacks), but will allow the UM to implement interactive BVOC emissions (i.e. with feedbacks) in the future. A paper has been written describing and evaluating the isoprene emission scheme - Pacifico et al. (2011).

Alternative build system: It is now possible to build JULES using FCM make. FCM is a set of tools developed by the Met Office for managing and building source code, with a particular focus on making it easy to build large Fortran programs (such as JULES). FCM is open source software, and can be downloaded for free from the Met Office website or from the Met Office GitHub repository.

Bugs fixed:

  • Array bounds error with SICE_INDEX_NCAT.
  • Incorrect usage of COR_MO_ITER.
  • Monthly/yearly output files not rolling over properly on certain configurations of GFortran.
  • A collection of small memory leaks.
  • Not able to read or write ASCII dumps with the new snow scheme on.
  • Use fixed dimension names for output files (rather than using those given for input files).
  • Using can_rad_mod = 5 causes night-time dark respiration to be 0 under certain circumstances.

 

JULES vn3.1

JULES version 3.1 sees little change to the science of JULES, but contains several major developments intended to make development easier going forward:

Restructuring of the code: The directory structure of the JULES code has been changed to be more logical and to allow a cleaner separation between control, initialisation, I/O and science code. This includes the introduction of directories containing UM-specific code for initialisation in the UM. This restructuring was done as part of the work to completely remove MOSES and JULES code from the UM code repository: there is now a single code repository for both standalone JULES and JULES in the UM, meaning the UM no longer contains any land surface model code and requires code from the JULES repository in order to create an executable.

New I/O framework: The input and output code has been completely revamped in order to modularise and simplify the code. It allows for data to be input on any time-step and interpolated down to the model time-step. Support for outputting of means and accumulations remains. NetCDF is now the only supported binary format (although it should be relatively simple to write drivers for other output formats if desired), and ASCII files are allowed for data at a single location only. Support for the GrADS flat binary format has been dropped, although the NetCDF output should be usable with GrADS with very little work.

User Interface changes: The user interface also sees significant changes. The monolithic run control file has been replaced by several smaller files containing Fortran namelists for input of options and parameters. This is more consistent with the UM, and offers the opportunity to adapt UM tools to provide a GUI for running JULES in the future.

Other changes: There are several not-insignificant changes to the science code:

  • Structures are now used for dimensioning variables - this allows for more flexibility of grids than the old system of row_length/rows and halos.
  • Move to a new implicit solver - sf_impl2 is now used rather than sf_impl for consistency with the UM. However, the way the implicit coupling is set up means it operates in a similar way to the old scheme.
  • A change in the way fresh snow is handled in the multi-layer snow scheme – the density of fresh snow is now prescribed by a new variable (rho_snow_fresh). Suggested by Cécile Ménard and implemented by Doug Clark.
  • Bug fix from Doug Clark for the multi-layer snow scheme that fixes problems with the model oscillating between 0 and 1 snow layers every time-step, preventing snow melt.
  • Changes to the sea-ice surface exchange when operating as part of the UM. This will not affect the majority of users.
  • Slight changes to the coupling between the explicit and implicit schemes. The vast majority of users will not need to worry about this.

 

JULES vn3.0

The major change in version 3.0 is the introduction of the IMOGEN impacts tool. IMOGEN is a system in which JULES is gridded onto surface land points and forced with an emulation of climate change using “pattern-scaling” calibrated against the Hadley Centre GCM; in its simplest form, pattern-scaling estimates the local anomaly in a climate variable as a fixed spatial pattern multiplied by the change in global mean temperature. This climate-change impacts system has the following advantages:

  • The pattern-scaling allows estimates of climate change for a broad range of emissions scenarios
  • New process understanding can be tested for its global implications
  • New process understanding can also be checked for stability before full inclusion in a GCM
  • By adding climate change anomalies to datasets such as the CRU dataset, GCM biases can be removed

It must be recognised that the system is “off-line”, so if major changes to the land surface occur, there might be local and regional feedbacks that can only be predicted using a fully coupled GCM. Hence IMOGEN does not replace GCMs, but it does give a very powerful first look at potential land surface changes in an anthropogenically forced, varying climate. This work was accomplished with help from Mark Lomas at the University of Sheffield and Chris Huntingford at CEH.

There are also several small bug fixes:

  • A fix affecting fluxes in sf_stom from Lina Mercado at CEH. This bug fix was announced on the JULES mailing list
  • Small fixes for potential evaporation and canopy snow depth from the UM
  • A small issue with some memory not being deallocated at the end of a run

 

JULES vn2.2

Along with fixes for known bugs, the changes made for version 2.2 mostly consist of several small additions to the science code. Changes to the control code have mostly been limited to bug-fixes.

  • New options for treatment of urban tiles - inclusion of the Met Office Reading Urban Surface Exchange Scheme (MORUSES) and a simple two tile urban scheme
  • Effects of ozone damage on stomata from Stephen Sitch at the University of Leeds
  • New treatment of direct/diffuse radiation in the canopy from Lina Mercado at CEH
  • A new switch allows the competing-vegetation part of TRIFFID to be switched on and off independently of the rest of TRIFFID (i.e. it is now possible to use the RothC soil carbon model without changing vegetation fractions)

There have also been changes made to the way JULES is compiled, due to the re-integration with the Met Office Unified Model. The Unified Model uses pre-processor directives to compile different versions of routines depending on the selected science options. For compatibility with this system, JULES now requires a compiler with a pre-processor. The majority of users should not notice this change – most modern compilers include a pre-processor, and the Makefile deals with setting up the appropriate pre-processor options.
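
For illustration (the macro name used here is an assumption, not necessarily the one in the code), conditional compilation selects between UM and standalone code paths like this:

    PROGRAM cpp_sketch
      IMPLICIT NONE
      ! UM_JULES is a hypothetical macro; compiling with -DUM_JULES
      ! (and a pre-processed file suffix such as .F90) selects branch one.
    #if defined(UM_JULES)
      PRINT *, 'Built for the UM'
    #else
      PRINT *, 'Built standalone'
    #endif
    END PROGRAM cpp_sketch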

Finally, JULES was added to the UM code repository as a mirror of the JULES repository (at UM version vn7.5, JULES vn2.2).

 

JULES vn2.1

Versions 2.1.1 and 2.1.2 were released to fix major bugs found in v2.1 - they contain no new features.

Version 2.1 of JULES includes extensive modifications to the descriptions of the processes and to the control-level code (such as input and output); these are covered briefly below. Several bug fixes and minor changes to make the code more robust have also been applied. All files are now technically Fortran 90 (.f90), although many are simply reformatted Fortran 77 files in which continuation lines are now indicated by the use of the ‘&’ character.

Process descriptions: The main change is that a new multi-layer snow scheme is available. This scheme was developed by Richard Essery at the University of Edinburgh and co-workers. The older, simple scheme represents the snowpack as a single layer with prescribed properties such as density, whereas the new scheme has a variable number of layers according to the depth of snow present, and each layer has prognostic temperature, density, grain size, and solid and liquid water content. The new scheme reverts to the previous, simpler scheme when the snowpack becomes very thin.

A four-pool soil carbon model based on the RothC model now replaces the single pool model when dynamic vegetation (TRIFFID) is selected.

There have been several major changes that most users will not notice or need be concerned about. These include:

  • A change in the linearization procedure that is used in the calculation of surface energy fluxes (described in the technical documentation)
  • A standard interface is now used to calculate fluxes over land, sea and sea ice
  • Each surface tile now has an elevation relative to the gridbox mean

These changes mean that, even with the new snow scheme switched off (nsmax=0), results from v2.1 will generally not be identical to those from v2.0.

Control-level code: The major change at v2.1 to the control-level code is that NetCDF output is now supported. Both diagnostic and restart files (dumps) can be in netCDF format. There have been several changes to the run control file, partly to reflect new science but also in an attempt to organise the file better. These changes mean that run control and restart files from JULES v2.0 are not compatible with v2.1 (although they could be reformatted without too much difficulty).

Finally, since JULES vn1.0 the MOSES and JULES code bases had been evolving separately, but with JULES v2.1 these differences have been reconciled with the UM.

 

JULES vn2.0

The physical processes and their representation in version 2.0 have not changed from version 1.0. However, version 2.0 is much more flexible in terms of input and output, and allows JULES to be run on a grid of points. New features include:

  • Ability to run on a grid
  • Choice of ASCII or binary formats for input and output files (also limited support of NetCDF input)
  • More flexible surface types – number and types can vary
  • Optional time-varying, prescribed vegetation properties
  • More choice of meteorological input variables
  • Optional automatic spin-up
  • Enhanced diagnostics – large choice of variables, frequency of output, sampling frequency, etc.

 

JULES vn1.0

Initial public release of the JULES code. JULES v1.0 runs only at a single point and supports only ASCII data.