1.1.2.2. NetCDF tools

The altimetry.tools.nctools module contains tools dedicated to easy handling of NetCDF data.

1.1.2.2.1. An easy-to-use wrapper around the NetCDF4 package - altimetry.tools.nctools.nc

class altimetry.tools.nctools.nc(limit=[-90.0, 0.0, 90.0, 360.0], verbose=0, zero_2pi=False, transpose=False, use_local_dims=False, **kwargs)[source]

A class for easy handling of NetCDF data, based on the NetCDF4 package.

Example :

To load different sets of data, try these :

  • Simply load a NetCDF file

    • The file has standard dimensions (eg. called longitude & latitude)
    ncr=nc()
    data=ncr.read(file)
    
    lon=data.lon
    lat=data.lat
    Z=data.Z
    
    • We do not want to match standard dimension names and prefer to keep the original names
    ncr=nc()
    data=ncr.read(file,use_local_dims=True)
    
    lon=data.longitude
    lat=data.latitude
    Z=data.Z
    
    • We extract a region and depth range between 2 dates:
      • We extract between 30-40°N & 15-20°E (limit).
      • We extract between 100 & 200 m deep (depth).
      • We get data from 2010/01/01 to 2010/01/07 (time).
      • File has standard dimensions called longitude, latitude, level and time
    ncr=nc()
    limit=[30,15,40,20]
    depth=[100,200]
    time=[21915,21921]
    
    data=ncr.read(file,
                  limit=limit,
                  timerange=time,
                  depthrange=depth)
      
    lon=data.lon
    lat=data.lat
    dep=data.depth
    dat=data.time
    Z=data.Z
    
  • More sophisticated example using a file containing bathymetry data

    • Load a file, extract a region and subsample it to a lower resolution
      • The file has dimensions NbLongitudes & NbLatitudes.
      • We extract between 30-40°N & 15-20°E (limit).
      • We subsample every 3 points (stride).
    limit=[30,15,40,20]
    stride = (3,)
    ncr=nc(use_local_dims=True)
    bathy=ncr.load(file,
                   NbLongitudes=(limit[1],limit[3])+stride,
                   NbLatitudes=(limit[0],limit[2])+stride)
    
    • Then we save the data to another file (output).
    #save data
    bathy.write_nc(output)
    
    • We update the history global attribute of the data structure
    #Get attribute structure
    attrStr=bathy.get('_attributes',{})
     
    #Get arguments called from the shell
    import os, sys
    cmd=[os.path.basename(sys.argv[0])]
    for a in sys.argv[1:] : cmd.append(a)
     
    #update attribute structure (pop history and concatenate with the current command)
    attrStr.update({'history':attrStr.pop('history','')+' '.join(cmd)+'\n'})
     
    #update NetCDF data structure
    bathy.update({'_attributes':attrStr})
    
    #save data
    bathy.write_nc(output)
    
    • We now want to flag all values of variable Z at or above 0 by setting them to fill_value, and append this modified variable to the output file
    #load variable
    Z = bathy.Z
    
    #flag variable (mask values at or above 0 so they are written as fill_value)
    Z.mask[Z >= 0] = True
    
    #update attributes
    Z['_attributes']['long_name'] = 'flagged bathymetry'
    
    #append the modified bathymetry to a variable named Z2 in the output file.
    bathy.push(output,'Z2',Z)
    
attributes(filename, **kwargs)[source]

Get attributes of a NetCDF file

Returns:outStr ({type:dict}) – Attribute structure.

Author :Renaud Dussurget
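A minimal usage sketch (the file name my_file.nc is hypothetical; the returned structure is assumed to be a plain dict, as stated above) :

#get the attribute structure of a file and read its history attribute
ncr=nc()
attrStr=ncr.attributes('my_file.nc')
history=attrStr.get('history','')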

count = None

number of files loaded

fileid = None

array of file IDs

limit = None

limits of the domain : [latmin,lonmin,latmax,lonmax] (default = [-90.,0.,90.,360.])

Note

limits are automatically reset using altimetry.tools.recale_limits()
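For instance, a minimal sketch of setting the limit at construction time (same region as in the examples above), so that every subsequent read is restricted to it :

#extract between 30-40°N & 15-20°E for all subsequent reads
ncr=nc(limit=[30.,15.,40.,20.])
data=ncr.read(file)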

load(filename, params=None, force=False, depthrange=None, timerange=None, output_is_dict=True, **kwargs)[source]

NetCDF data loader

Parameters:
  • filename – file name
  • params – a list of variables to load (default : load ALL variables).
  • depthrange – if a depth dimension is found, subset along this dimension.
  • timerange – if a time dimension is found, subset along this dimension.

Note

using altimetry.tools.nctools.limit allows subsetting to a given region.

Parameters:kwargs – additional arguments for subsetting along given dimensions.

Note

You can index along any dimension by providing the names of the dimensions to subsample along. The value associated with each provided keyword should be a length 2 or 3 tuple (min,max,<step>) (cf. altimetry.tools.nctools.load_ncVar()).
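For instance, reusing the NbLatitudes dimension of the bathymetry example above, a length 3 tuple selects a range of values and applies a stride (a sketch; the bounds and the stride are arbitrary) :

ncr=nc(use_local_dims=True)
#subset NbLatitudes between 30 and 40, keeping every 3rd point
data=ncr.load(file,NbLatitudes=(30,40,3))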

Parameters:output_is_dict – data structures are dictionaries (e.g. my_hydro_data.variable['data']). If False, uses an object with attributes (e.g. my_hydro_data.variable.data).

Returns:outStr ({type:dict}) – Output data structure containing all recorded parameters, as specified by the NetCDF file PARAMETER list.

Author :Renaud Dussurget

message(MSG_LEVEL, str)[source]

Print function wrapper: prints a message depending on the verbose level.

Parameters:

MSG_LEVEL ({in}{required}{type=int}) – level of the message to be compared with self.verbose

Example :

display a message

self.message(0,'This message will be shown for any verbose level')
Author :

Renaud DUSSURGET (RD), LER PAC/IFREMER

Change :

Added a case for variables with missing dimensions

push(*args, **kwargs)[source]

Append a variable from a given data structure to the existing dataset.

Parameters:
  • file (optional) – output file to which the variable is appended.
  • name – variable name
  • value – data
  • start – broadcast the data to a portion of the dataset. starting index.
  • counts – broadcast the data to a portion of the dataset. number of counts.
  • stride – broadcast the data to a portion of the dataset. stepping along dimension.
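A minimal sketch, following the bathymetry example above (the start/counts values are hypothetical and only illustrate broadcasting the data to a portion of the dataset) :

#append the whole variable Z as Z2 to the output file
bathy.push(output,'Z2',Z)

#write Z into a portion of the dataset only (hypothetical indices)
bathy.push(output,'Z2',Z,start=(0,0),counts=(10,20))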

read(file_pattern, **kwargs)[source]

Read data from a NetCDF file

Parameters:file_pattern – file name or pattern of the NetCDF file(s) to read.

size = None

length of the dataset

use_local_dims = None

this option prevents the detection of standard CF dimensions (such as longitude, latitude and time) in the file and keeps the original dimension names of the file

Note

Set this option to True when the file is not standard (e.g. not following CF conventions).

Note

Normal behaviour is to match dimensions (i.e. a dimension and the associated variable of the same name) against specific names. The resulting variables associated with these dimensions will be called :
  • lon (longitudes) : matches dimensions starting with ‘lon’
  • lat (latitudes) : matches dimensions starting with ‘lat’
  • time (time) : matches dimensions starting with ‘date’ or ‘time’
  • depth (depths) : matches dimensions starting with ‘dep’ or ‘lev’

verbose = None

verbosity level on a scale of 0 (silent) to 4 (max verbosity)

write(data, outfile, clobber=False, format='NETCDF4')[source]

Write a netCDF file using a data structure.

Parameters:
  • data – data structure
  • outfile – output file
  • clobber – erase file if it already exists
  • format – NetCDF file format.

Note

the data structure requires a “_dimensions” field (dimension structure)
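A minimal sketch of such a data structure (the exact layout of the “_dimensions” field and of the variable sub-structures is an assumption, based on the dictionary-style structures used in the examples above) :

import numpy as np

#minimal data structure with a "_dimensions" field (layout assumed, not verified)
data={'_dimensions':{'lon':10},
      'lon':{'data':np.arange(10,dtype=float),
             '_attributes':{'long_name':'longitude'}}}

ncr=nc()
ncr.write(data,'out.nc',clobber=True)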

1.1.2.2.2. Additional function

altimetry.tools.nctools.load_ncVar(varName, nc=None, **kwargs)[source]

Loads a variable from the NetCDF file and saves it as a data structure.

Parameters:varName – variable name
Keywords kwargs:
 additional keyword arguments for slicing the dataset. Keywords should be named after the dimensions to subsample along, and the associated values should be length 2 or 3 tuples (min,max,<step>).
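A minimal sketch, assuming the nc keyword expects an already opened netCDF4.Dataset (an assumption) and reusing the NbLatitudes dimension from the bathymetry example above :

from netCDF4 import Dataset
from altimetry.tools.nctools import load_ncVar

ncf=Dataset(file)   #open the file (file path defined elsewhere)
#load variable Z, subsetting NbLatitudes between 30 and 40 with a stride of 3
Z=load_ncVar('Z',nc=ncf,NbLatitudes=(30,40,3))
ncf.close()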