Parcels documentation

parcels.particleset module

class parcels.particleset.ParticleSet(fieldset, pclass=<class 'parcels.particle.JITParticle'>, lon=None, lat=None, depth=None, time=None)[source]

Bases: object

Container class for storing particles and executing kernels over them.

Please note that this currently only supports fixed size particle sets.

Parameters:
  • fieldset – parcels.fieldset.FieldSet object from which to sample velocity
  • pclass – Optional parcels.particle.JITParticle or parcels.particle.ScipyParticle object that defines custom particle
  • lon – List of initial longitude values for particles
  • lat – List of initial latitude values for particles
  • depth – Optional list of initial depth values for particles. Default is 0m
  • time – Optional list of initial time values for particles. Default is fieldset.U.time[0]
Kernel(pyfunc)[source]

Wrapper method to convert a pyfunc into a parcels.kernel.Kernel object based on fieldset and ptype of the ParticleSet

ParticleFile(*args, **kwargs)[source]

Wrapper method to initialise a parcels.particlefile.ParticleFile object from the ParticleSet

add(particles)[source]

Method to add particles to the ParticleSet

density(field=None, particle_val=None, relative=False, area_scale=True)[source]

Method to calculate the density of particles in a ParticleSet from their locations, through a 2D histogram

Parameters:
  • field – Optional parcels.field.Field object to calculate the histogram on. Default is fieldset.U
  • particle_val – Optional list of values to weigh each particle with
  • relative – Boolean to control whether the density is scaled by the total number of particles
  • area_scale – Boolean to control whether the density is scaled by the area (in m^2) of each grid cell
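The density calculation amounts to binning particle positions into the field's grid cells via a 2D histogram. A minimal numpy sketch of the idea (not Parcels' actual implementation; the positions and grid edges below are made up):

```python
import numpy as np

# Hypothetical particle positions (degrees)
lon = np.array([0.5, 0.5, 1.5, 2.5])
lat = np.array([0.5, 0.5, 0.5, 1.5])

# Hypothetical grid-cell edges of the field to bin onto
lon_edges = np.array([0.0, 1.0, 2.0, 3.0])
lat_edges = np.array([0.0, 1.0, 2.0])

# 2D histogram of particle counts per grid cell
H, _, _ = np.histogram2d(lon, lat, bins=[lon_edges, lat_edges])

# relative=True corresponds to dividing by the total particle count
density_rel = H / lon.size
```

With area_scale=True, each cell count would additionally be divided by the cell's area in m^2.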
execute(pyfunc=<function AdvectionRK4>, starttime=None, endtime=None, dt=1.0, runtime=None, interval=None, recovery=None, output_file=None, show_movie=False)[source]

Execute a given kernel function over the particle set for multiple timesteps. Optionally also provide sub-timestepping for particle output.

Parameters:
  • pyfunc – Kernel function to execute. This can be the name of a defined Python function or a parcels.kernel.Kernel object. Kernels can be concatenated using the + operator
  • starttime – Starting time for the timestepping loop. Defaults to 0.0.
  • endtime – End time for the timestepping loop
  • runtime – Length of the timestepping loop. Use instead of endtime.
  • dt – Timestep interval to be passed to the kernel
  • interval – Interval for inner sub-timestepping (leap), which dictates the update frequency of file output and animation.
  • output_file – parcels.particlefile.ParticleFile object for particle output
  • recovery – Dictionary with additional parcels.kernels.error recovery kernels to allow custom recovery behaviour in case of kernel errors.
  • show_movie – True shows particles; name of field plots that field as background
classmethod from_field(fieldset, pclass, start_field, size, mode='monte_carlo', depth=None, time=None)[source]

Initialise the ParticleSet with particle locations randomly drawn according to the distribution of a field

Parameters:
  • fieldset – parcels.fieldset.FieldSet object from which to sample velocity
  • pclass – parcels.particle.JITParticle or parcels.particle.ScipyParticle object that defines custom particle
  • start_field – Field for initialising particles stochastically according to the presented density field.
  • size – Initial size of particle set
  • mode – Type of random sampling. Currently only ‘monte_carlo’ is implemented
  • depth – Optional list of initial depth values for particles. Default is 0m
  • time – Optional start time value for particles. Default is fieldset.U.time[0]
classmethod from_line(fieldset, pclass, start, finish, size, depth=None, time=None)[source]

Initialise the ParticleSet from start/finish coordinates with equidistant spacing. Note that this method uses simple numpy.linspace calls and does not take into account great circles, so may not be exact on a globe

Parameters:
  • fieldset – parcels.fieldset.FieldSet object from which to sample velocity
  • pclass – parcels.particle.JITParticle or parcels.particle.ScipyParticle object that defines custom particle
  • start – Starting point for initialisation of particles on a straight line.
  • finish – End point for initialisation of particles on a straight line.
  • size – Initial size of particle set
  • depth – Optional list of initial depth values for particles. Default is 0m
  • time – Optional start time value for particles. Default is fieldset.U.time[0]
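Since from_line uses plain numpy.linspace calls, the resulting particle positions can be sketched directly (the start/finish values here are hypothetical):

```python
import numpy as np

start = (28.0, -33.0)    # (lon, lat) of the first particle
finish = (30.0, -31.0)   # (lon, lat) of the last particle
size = 5                 # number of particles

# Equidistant spacing in lon/lat space, ignoring great circles
lon = np.linspace(start[0], finish[0], size)
lat = np.linspace(start[1], finish[1], size)
```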
classmethod from_list(fieldset, pclass, lon, lat, depth=None, time=None)[source]

Initialise the ParticleSet from lists of lon and lat

Parameters:
  • fieldset – parcels.fieldset.FieldSet object from which to sample velocity
  • pclass – parcels.particle.JITParticle or parcels.particle.ScipyParticle object that defines custom particle
  • lon – List of initial longitude values for particles
  • lat – List of initial latitude values for particles
  • depth – Optional list of initial depth values for particles. Default is 0m
  • time – Optional list of start time values for particles. Default is fieldset.U.time[0]
remove(indices)[source]

Method to remove particles from the ParticleSet, based on their indices

show(particles=True, show_time=None, field=True, domain=None, land=False, vmin=None, vmax=None, savefile=None)[source]

Method to ‘show’ a Parcels ParticleSet

Parameters:
  • particles – Boolean whether to show particles
  • show_time – Time at which to show the ParticleSet
  • field – Field to plot under particles (either True, a Field object, or ‘vector’)
  • domain – Four-vector (latN, latS, lonE, lonW) defining domain to show
  • land – Boolean whether to show land (in field=’vector’ mode only)
  • vmin – minimum colour scale (only in single-plot mode)
  • vmax – maximum colour scale (only in single-plot mode)
  • savefile – Name of a file to save the plot to

parcels.fieldset module

class parcels.fieldset.FieldSet(U, V, fields={})[source]

Bases: object

FieldSet class that holds hydrodynamic data needed to execute particles

Parameters:
  • U – parcels.field.Field object for the zonal velocity component
  • V – parcels.field.Field object for the meridional velocity component
  • fields – Optional dictionary of additional parcels.field.Field objects
add_constant(name, value)[source]

Add a constant to the FieldSet. Note that all constants are stored as 32-bit floats. While constants can be updated during execution in SciPy mode, they can not be updated in JIT mode.

Parameters:
  • name – Name of the constant
  • value – Value of the constant (stored as 32-bit float)
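Because constants are stored as 32-bit floats, values that are not exactly representable in 32 bits lose precision. A small numpy sketch of the effect, with numpy.float32 standing in for the FieldSet's internal storage:

```python
import numpy as np

value = 0.1                  # a Python double
stored = np.float32(value)   # 32-bit storage, as used by add_constant

# The round-trip differs from the original double by a small amount
error = abs(float(stored) - value)
```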
add_field(field)[source]

Add a parcels.field.Field object to the FieldSet

Parameters: field – parcels.field.Field object to be added
add_periodic_halo(zonal=False, meridional=False, halosize=5)[source]

Add a ‘halo’ to all parcels.field.Field objects in a FieldSet, through extending the Field (and lon/lat) by copying a small portion of the field on one side of the domain to the other.

Parameters:
  • zonal – Create a halo in zonal direction (boolean)
  • meridional – Create a halo in meridional direction (boolean)
  • halosize – size of the halo (in grid points). Default is 5 grid points
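The halo construction amounts to copying edge strips of the data across the domain. A numpy sketch of a zonal halo (illustrative only, not Parcels' code; the field data is made up):

```python
import numpy as np

halosize = 2
data = np.arange(20.0).reshape(4, 5)   # hypothetical (lat, lon) field

# Append the easternmost columns on the west side and vice versa,
# so interpolation near the boundary wraps around the domain
halo = np.concatenate(
    (data[:, -halosize:], data, data[:, :halosize]), axis=1
)
```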
advancetime(fieldset_new)[source]

Replace oldest time on FieldSet with new FieldSet

Parameters: fieldset_new – FieldSet snapshot with which the oldest time has to be replaced
eval(x, y)[source]

Evaluate the zonal and meridional velocities (u,v) at a point (x,y)

Parameters:
  • x – zonal point to evaluate
  • y – meridional point to evaluate
Return u, v:

zonal and meridional velocities at point

fields

Returns a list of all the parcels.field.Field objects associated with this FieldSet

classmethod from_data(data, dimensions, transpose=True, mesh='spherical', allow_time_extrapolation=True, time_periodic=False, **kwargs)[source]

Initialise FieldSet object from raw data

Parameters:
  • data – Dictionary mapping field names to numpy arrays. Note that at least a ‘U’ and ‘V’ numpy array need to be given
  • dimensions – Dictionary mapping field dimensions (lon, lat, depth, time) to numpy arrays. Note that dimensions can also be a dictionary of dictionaries if dimension names are different for each variable (e.g. dimensions[‘U’], dimensions[‘V’], etc).
  • transpose – Boolean whether to transpose data on read-in
  • mesh

    String indicating the type of mesh coordinates and units used during velocity interpolation:

    1. spherical (default): Lat and lon in degree, with a correction for zonal velocity U near the poles.
    2. flat: No conversion, lat/lon are assumed to be in m.
  • allow_time_extrapolation – boolean whether to allow for extrapolation
  • time_periodic – boolean whether to loop periodically over the time component of the FieldSet. This flag overrides allow_time_extrapolation and sets it to False
classmethod from_nemo(basename, uvar='vozocrtx', vvar='vomecrty', indices={}, extra_fields={}, allow_time_extrapolation=False, time_periodic=False, **kwargs)[source]

Initialises FieldSet data from files using NEMO conventions.

Parameters:
  • basename – Base name of the file(s); may contain wildcards to indicate multiple files.
  • extra_fields – Extra fields to read beyond U and V
  • indices – Optional dictionary of indices for each dimension to read from file(s), to allow for reading of subset of data. Default is to read the full extent of each dimension.
  • allow_time_extrapolation – boolean whether to allow for extrapolation
  • time_periodic – boolean whether to loop periodically over the time component of the FieldSet. This flag overrides allow_time_extrapolation and sets it to False
classmethod from_netcdf(filenames, variables, dimensions, indices={}, mesh='spherical', allow_time_extrapolation=False, time_periodic=False, **kwargs)[source]

Initialises FieldSet data from NetCDF files.

Parameters:
  • filenames – Dictionary mapping variables to file(s). The filepath may contain wildcards to indicate multiple files, or be a list of files.
  • variables – Dictionary mapping variables to variable names in the netCDF file(s).
  • dimensions – Dictionary mapping data dimensions (lon, lat, depth, time, data) to dimensions in the netCDF file(s). Note that dimensions can also be a dictionary of dictionaries if dimension names are different for each variable (e.g. dimensions[‘U’], dimensions[‘V’], etc).
  • indices – Optional dictionary of indices for each dimension to read from file(s), to allow for reading of subset of data. Default is to read the full extent of each dimension.
  • mesh

    String indicating the type of mesh coordinates and units used during velocity interpolation:

    1. spherical (default): Lat and lon in degree, with a correction for zonal velocity U near the poles.
    2. flat: No conversion, lat/lon are assumed to be in m.
  • allow_time_extrapolation – boolean whether to allow for extrapolation
  • time_periodic – boolean whether to loop periodically over the time component of the FieldSet. This flag overrides allow_time_extrapolation and sets it to False
write(filename)[source]

Write FieldSet to NetCDF file using NEMO convention

Parameters: filename – Basename of the output fileset

parcels.field module

parcels.field.CentralDifferences(field_data, lat, lon)[source]

Function to calculate gradients in two dimensions using central differences on a field

Parameters:
  • field_data – data to take the gradients of
  • lat – latitude vector
  • lon – longitude vector
Return type:

gradient of data in zonal and meridional direction
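numpy's np.gradient applies the same scheme (central differences in the interior, one-sided differences at the boundaries), so the behaviour can be sketched and checked on a linear field whose gradient is known exactly. This is a stand-in for the documented function, not its actual implementation:

```python
import numpy as np

lat = np.linspace(0.0, 3.0, 4)
lon = np.linspace(0.0, 4.0, 5)
LON, LAT = np.meshgrid(lon, lat)      # field_data laid out as (lat, lon)

field_data = 2.0 * LON + 3.0 * LAT    # linear test field

# Gradients along the lat axis (meridional) and lon axis (zonal)
dFdlat, dFdlon = np.gradient(field_data, lat, lon)
```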

class parcels.field.Field(name, data, lon, lat, depth=None, time=None, transpose=False, vmin=None, vmax=None, time_origin=0, units=None, interp_method='linear', allow_time_extrapolation=None, time_periodic=False)[source]

Bases: object

Class that encapsulates access to field data.

Parameters:
  • name – Name of the field
  • data – 2D, 3D or 4D array of field data
  • lon – Longitude coordinates of the field
  • lat – Latitude coordinates of the field
  • depth – Depth coordinates of the field
  • time – Time coordinates of the field
  • transpose – Transpose data to required (lon, lat) layout
  • vmin – Minimum allowed value on the field. Data below this value are set to zero
  • vmax – Maximum allowed value on the field. Data above this value are set to zero
  • time_origin – Time origin of the time axis
  • units – type of units of the field (meters or degrees)
  • interp_method – Method for interpolation
  • allow_time_extrapolation – boolean whether to allow for extrapolation
  • time_periodic – boolean whether to loop periodically over the time component of the Field. This flag overrides allow_time_extrapolation and sets it to False
add_periodic_halo(zonal, meridional, halosize=5)[source]

Add a ‘halo’ to the Field, through extending the Field (and lon/lat) by copying a small portion of the field on one side of the domain to the other.

Parameters:
  • zonal – Create a halo in zonal direction (boolean)
  • meridional – Create a halo in meridional direction (boolean)
  • halosize – size of the halo (in grid points). Default is 5 grid points
ctypes_struct

Returns a ctypes struct object containing all relevant pointers and sizes for this field.

depth_index(depth, lat, lon)[source]

Find the index in the depth array associated with a given depth

eval(time, x, y, z)[source]

Interpolate field values in space and time.

We interpolate linearly in time and apply implicit unit conversion to the result. Note that we defer to scipy.interpolate to perform spatial interpolation.

classmethod from_netcdf(name, dimensions, filenames, indices={}, allow_time_extrapolation=False, **kwargs)[source]

Create field from netCDF file

Parameters:
  • name – Name of the field to create
  • dimensions – Variable names for the relevant dimensions
  • filenames – Filenames of the field
  • indices – indices for each dimension to read from file
  • allow_time_extrapolation – boolean whether to allow for extrapolation
gradient(timerange=None, lonrange=None, latrange=None, name=None)[source]

Method to create gradients of Field

interpolator2D(t_idx, z_idx=None)[source]

Provide a SciPy interpolator for spatial interpolation

Note that the interpolator is configured to return NaN for out-of-bounds coordinates.

interpolator3D(idx, z, y, x)[source]

Scipy implementation of 3D interpolation, by first interpolating in the horizontal, then in the vertical

show(with_particles=False, animation=False, show_time=0, vmin=None, vmax=None)[source]

Method to ‘show’ a Field using matplotlib

Parameters:
  • with_particles – Boolean whether particles are also plotted on Field
  • animation – Boolean whether result is a single plot, or an animation
  • show_time – Time at which to show the Field (only in single-plot mode)
  • vmin – minimum colour scale (only in single-plot mode)
  • vmax – maximum colour scale (only in single-plot mode)
spatial_interpolation(tidx, z, y, x)[source]

Interpolate horizontal field values using a SciPy interpolator

temporal_interpolate_fullfield(tidx, time)[source]

Calculate the data of a field between two snapshots, using linear interpolation

Parameters:
  • tidx – Index in time array associated with time (via time_index())
  • time – Time to interpolate to
Return type:

Linearly interpolated field
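The interpolation itself is a weighted average of the two bracketing snapshots. A numpy sketch (the snapshot times and values are made up):

```python
import numpy as np

t0, t1 = 0.0, 10.0             # times of the two bracketing snapshots
f0 = np.zeros((3, 3))          # field data at t0
f1 = 4.0 * np.ones((3, 3))     # field data at t1

time = 2.5
w = (time - t0) / (t1 - t0)    # linear weight of the later snapshot
field_t = (1.0 - w) * f0 + w * f1
```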

time_index(time)[source]

Find the index in the time array associated with a given time

Note that we normalize to either the first or the last index if the sampled value is outside the time value range.
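The clamping behaviour can be sketched with numpy.searchsorted (illustrative only; the helper name time_index simply mirrors the method, and the snapshot times are hypothetical):

```python
import numpy as np

times = np.array([0.0, 10.0, 20.0, 30.0])  # hypothetical snapshot times

def time_index(t):
    # Index of the snapshot at or before t; out-of-range samples are
    # clamped to the first or last valid interval
    idx = np.searchsorted(times, t, side="right") - 1
    return int(np.clip(idx, 0, times.size - 2))
```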

write(filename, varname=None)[source]

Write a Field to a netcdf file

Parameters:
  • filename – Basename of the file
  • varname – Name of the field, to be appended to the filename
class parcels.field.Geographic[source]

Bases: parcels.field.UnitConverter

Unit converter from geometric to geographic coordinates (m to degree)

class parcels.field.GeographicPolar[source]

Bases: parcels.field.UnitConverter

Unit converter from geometric to geographic coordinates (m to degree) with a correction to account for narrower grid cells closer to the poles.

parcels.particle module

class parcels.particle.ScipyParticle(lon, lat, fieldset, depth=0.0, dt=1.0, time=0.0, cptr=None)[source]

Bases: parcels.particle._Particle

Class encapsulating the basic attributes of a particle, to be executed in SciPy mode

Parameters:
  • lon – Initial longitude of particle
  • lat – Initial latitude of particle
  • depth – Initial depth of particle
  • fieldset – parcels.fieldset.FieldSet object to track this particle on
  • dt – Execution timestep for this particle
  • time – Current time of the particle

Additional variables can be added via parcels.particle.Variable objects

class parcels.particle.JITParticle(*args, **kwargs)[source]

Bases: parcels.particle.ScipyParticle

Particle class for JIT-based (Just-In-Time) Particle objects

Parameters:
  • lon – Initial longitude of particle
  • lat – Initial latitude of particle
  • fieldset – parcels.fieldset.FieldSet object to track this particle on
  • dt – Execution timestep for this particle
  • time – Current time of the particle

Additional variables can be added via parcels.particle.Variable objects

Users should use JITParticles for faster advection computation.

class parcels.particle.Variable(name, dtype=<type 'numpy.float32'>, initial=0, to_write=True)[source]

Bases: object

Descriptor class that delegates data access to particle data

Parameters:
  • name – Variable name as used within kernels
  • dtype – Data type (numpy.dtype) of the variable
  • initial – Initial value of the variable. Note that this can also be a Field object, which will then be sampled at the location of the particle
  • to_write – Boolean to control whether Variable is written to NetCDF file
is64bit()[source]

Check whether variable is 64-bit

parcels.kernels.advection module

Collection of pre-built advection kernels

parcels.kernels.advection.AdvectionRK4(particle, fieldset, time, dt)[source]

Advection of particles using fourth-order Runge-Kutta integration.

Function needs to be converted to Kernel object before execution

parcels.kernels.advection.AdvectionEE(particle, fieldset, time, dt)[source]

Advection of particles using Explicit Euler (aka Euler Forward) integration.

Function needs to be converted to Kernel object before execution

parcels.kernels.advection.AdvectionRK45(particle, fieldset, time, dt)[source]

Advection of particles using adaptive Runge-Kutta 4/5 integration.

Time-step dt is halved if the error is larger than the tolerance, and doubled if the error is smaller than 1/10th of the tolerance, with the tolerance set to 1e-9 * dt by default.
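The step-size rule as documented can be written out as a small helper (a sketch of the stated rule only, not Parcels' actual kernel code; adapt_dt is a hypothetical name):

```python
def adapt_dt(dt, error, rtol=1e-9):
    # Tolerance scales with the time-step, 1e-9 * dt by default
    tol = rtol * abs(dt)
    if error > tol:
        return dt / 2.0        # error too large: halve the step
    if error < tol / 10.0:
        return dt * 2.0        # error comfortably small: double it
    return dt                  # otherwise keep the step unchanged
```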

parcels.kernels.advection.AdvectionRK4_3D(particle, fieldset, time, dt)[source]

Advection of particles using fourth-order Runge-Kutta integration including vertical velocity.

Function needs to be converted to Kernel object before execution

parcels.kernels.error module

Collection of pre-built recovery kernels

exception parcels.kernels.error.KernelError(particle, fieldset=None, msg=None)[source]

Bases: exceptions.RuntimeError

General particle kernel error with optional custom message

exception parcels.kernels.error.OutOfBoundsError(particle, fieldset=None, lon=None, lat=None, depth=None)[source]

Bases: parcels.kernels.error.KernelError

Particle kernel error for out-of-bounds field sampling

parcels.codegenerator module

class parcels.codegenerator.IntrinsicTransformer(fieldset, ptype)[source]

Bases: ast.NodeTransformer

AST transformer that catches any mention of intrinsic variable names, such as ‘particle’ or ‘fieldset’, inserts placeholder objects and propagates attribute access.

get_tmp()[source]

Create a new temporary variable name

visit_Name(node)[source]

Inject IntrinsicNode objects into the tree according to keyword

class parcels.codegenerator.KernelGenerator(fieldset, ptype)[source]

Bases: ast.NodeVisitor

Code generator class that translates simple Python kernel functions into C functions by populating and accessing the ccode attribute on nodes in the Python AST.

visit_Call(node)[source]

Generate C code for simple C-style function calls. Please note that starred and keyword arguments are currently not supported.

visit_FieldNode(node)[source]

Record intrinsic fields used in kernel

visit_Name(node)[source]

Catches any mention of intrinsic variable names, such as ‘particle’ or ‘fieldset’ and inserts our placeholder objects

class parcels.codegenerator.LoopGenerator(fieldset, ptype=None)[source]

Bases: object

Code generator class that adds type definitions and the outer loop around kernel functions to generate compilable C code.

class parcels.codegenerator.TupleSplitter[source]

Bases: ast.NodeTransformer

AST transformer that detects and splits Pythonic tuple assignments into multiple statements for conversion to C.

parcels.compiler module

class parcels.compiler.Compiler(cc, ld=None, cppargs=[], ldargs=[])[source]

Bases: object

A compiler object for creating and loading shared libraries.

Parameters:
  • cc – C compiler executable (can be overridden by exporting the environment variable CC).
  • ld – Linker executable (optional, if None, we assume the compiler can build object files and link in a single invocation, can be overridden by exporting the environment variable LDSHARED).
  • cppargs – A list of arguments to the C compiler (optional).
  • ldargs – A list of arguments to the linker (optional).
class parcels.compiler.GNUCompiler(cppargs=[], ldargs=[])[source]

Bases: parcels.compiler.Compiler

A compiler object for the GNU Linux toolchain.

Parameters:
  • cppargs – A list of arguments to pass to the C compiler (optional).
  • ldargs – A list of arguments to pass to the linker (optional).

parcels.kernel module

class parcels.kernel.Kernel(fieldset, ptype, pyfunc=None, funcname=None, funccode=None, py_ast=None, funcvars=None)[source]

Bases: object

Kernel object that encapsulates auto-generated code.

Parameters:
  • fieldset – FieldSet object providing the field information
  • ptype – PType object for the kernel particle

Note: A Kernel is either created from a compiled <function …> object or the necessary information (funcname, funccode, funcvars) is provided. The py_ast argument may be derived from the code string, but for concatenation, the merged AST plus the new header definition is required.

compile(compiler)[source]

Writes kernel code to file and compiles it.

execute(pset, endtime, dt, recovery=None)[source]

Execute this Kernel over a ParticleSet for several timesteps

execute_jit(pset, endtime, dt)[source]

Invokes JIT engine to perform the core update loop

execute_python(pset, endtime, dt)[source]

Performs the core update loop via Python

parcels.particlefile module

Module controlling the writing of ParticleSets to NetCDF file

class parcels.particlefile.ParticleFile(name, particleset, type='array')[source]

Initialise netCDF4.Dataset for trajectory output.

The output follows the format outlined in the Discrete Sampling Geometries section of the CF-conventions: http://cfconventions.org/cf-conventions/v1.6.0/cf-conventions.html#discrete-sampling-geometries

The current implementation is based on the NCEI template: http://www.nodc.noaa.gov/data/formats/netcdf/v2.0/trajectoryIncomplete.cdl

Both ‘Orthogonal multidimensional array’ and ‘Indexed ragged array’ representations are implemented. The former is simpler to post-process, but the latter is required when particles are added during .execute (i.e. the number of particles in the pset increases).

Developer note: We cannot use xray.Dataset here, since it does not yet allow incremental writes to disk: https://github.com/pydata/xarray/issues/199

Parameters:
  • name – Basename of the output file
  • particleset – ParticleSet to output
  • user_vars – A list of additional user defined particle variables to write
  • type – Either ‘array’ for default matrix style, or ‘indexed’ for indexed ragged array
sync()[source]

Write all buffered data to disk

write(pset, time, sync=True)[source]

Write parcels.particleset.ParticleSet data to file

Parameters:
  • pset – ParticleSet object to write
  • time – Time at which to write ParticleSet
  • sync – Optional argument whether to write data to disk immediately. Default is True

parcels.rng module

parcels.rng.seed(seed)[source]

Sets the seed for parcels internal RNG

parcels.rng.random()[source]

Returns a random float between 0. and 1.

parcels.rng.uniform(low, high)[source]

Returns a random float between low and high

parcels.rng.randint(low, high)[source]

Returns a random int between low and high

parcels.rng.normalvariate(loc, scale)[source]

Returns a random float drawn from a normal distribution with mean loc and width scale

parcels.loggers module

Script to create a logger for Parcels

scripts.plotParticles module

scripts.plotParticles.plotTrajectoriesFile(filename, mode='2d', tracerfile=None, tracerfield='P', tracerlon='x', tracerlat='y', recordedvar=None, show_plt=True)[source]

Quick and simple plotting of Parcels trajectories

Parameters:
  • filename – Name of Parcels-generated NetCDF file with particle positions
  • mode – Type of plot to show. Supported are ‘2d’, ‘3d’, ‘movie2d’ and ‘movie2d_notebook’. The latter two give animations, with ‘movie2d_notebook’ specifically designed for Jupyter notebooks
  • tracerfile – Name of NetCDF file to show as background
  • tracerfield – Name of variable to show as background
  • tracerlon – Name of longitude dimension of variable to show as background
  • tracerlat – Name of latitude dimension of variable to show as background
  • recordedvar – Name of variable used to color particles in scatter-plot. Only works in ‘movie2d’ or ‘movie2d_notebook’ mode.
  • show_plt – Boolean whether the plot should be shown directly (for py.test)

scripts.get_examples module

class scripts.pull_data.ExampleData(url, filenames, path)

download()

Function used to download example data from http://oceanparcels.org/examples-data