Design notes

Pipelines

Internally, S1 Tiling defines a series of pipelines. More precisely, it distinguishes pipeline descriptions from actual pipelines. Actual pipelines are generated from their descriptions and input files, and handled internally; they won’t be described here.

Each pipeline corresponds to a series of processings. The original and intended design is a direct match: one processing == one OTB application, which permits chaining OTB applications in memory through the OTB Python bindings.

In practice, a processing doesn’t always turn into the execution of an OTB application; sometimes other computations are needed.

When files need to be produced at some point, we end a pipeline; the next one(s) can take over from that point.

Simple pipelines

In simple cases, we can chain the output of an in-memory pipeline of OTB applications into the next pipeline.

At this moment, the following sequence of pipelines is defined:

pipelines = PipelineDescriptionSequence(config)
pipelines.register_pipeline([AnalyseBorders, Calibrate, CutBorders], 'PrepareForOrtho', product_required=False)
pipelines.register_pipeline([OrthoRectify],                          'OrthoRectify',    product_required=False)
pipelines.register_pipeline([Concatenate],                                              product_required=True)
if config.mask_cond:
    pipelines.register_pipeline([BuildBorderMask, SmoothBorderMask], 'GenerateMask',    product_required=True)

For instance, to minimize disk usage, we could chain in-memory orthorectification directly after the border cutting by removing the second pipeline, and by registering the following step into the first pipeline instead:

pipelines.register_pipeline([AnalyseBorders, Calibrate, CutBorders, OrthoRectify],
                            'OrthoRectify', product_required=False)

Complex pipelines

In more complex cases, the product of a pipeline is used as input of several other pipelines. Also, a pipeline can have several inputs coming from different pipelines.

To do so, we name each pipeline, so we can use that name as input of other pipelines.

For instance, LIA-producing pipelines are described this way:

pipelines = PipelineDescriptionSequence(config, dryrun=dryrun)
dem = pipelines.register_pipeline([AgglomerateDEM],
    'AgglomerateDEM',
    inputs={'insar': 'basename'})
demproj = pipelines.register_pipeline([ExtractSentinel1Metadata, SARDEMProjection],
    'SARDEMProjection',
    is_name_incremental=True,
    inputs={'insar': 'basename', 'indem': dem})
xyz = pipelines.register_pipeline([SARCartesianMeanEstimation],
    'SARCartesianMeanEstimation',
    inputs={'insar': 'basename', 'indem': dem, 'indemproj': demproj})
lia = pipelines.register_pipeline([ComputeNormals, ComputeLIA],
    'Normals|LIA',
    is_name_incremental=True,
    inputs={'xyz': xyz})

# "inputs" parameter doesn't need to be specified in all the following
# pipeline declarations but we still use it for clarity!
ortho  = pipelines.register_pipeline([filter_LIA('LIA'), OrthoRectifyLIA],
    'OrthoLIA',
    inputs={'in': lia},
    is_name_incremental=True)
concat = pipelines.register_pipeline([ConcatenateLIA],
    'ConcatLIA',
    inputs={'in': ortho})
select = pipelines.register_pipeline([SelectBestCoverage],
    'SelectLIA',
    product_required=True,
    inputs={'in': concat})
ortho_sin  = pipelines.register_pipeline([filter_LIA('sin_LIA'), OrthoRectifyLIA],
    'OrthoSinLIA',
    inputs={'in': lia},
    is_name_incremental=True)
concat_sin = pipelines.register_pipeline([ConcatenateLIA],
    'ConcatSinLIA',
    inputs={'in': ortho_sin})
select_sin = pipelines.register_pipeline([SelectBestCoverage],
    'SelectSinLIA',
    product_required=True,
    inputs={'in': concat_sin})

Dask: tasks

Given the pipeline descriptions, a requested S2 tile, and its intersecting S1 images, S1 Tiling builds a set of dependent Dask tasks. Each task corresponds to an actual pipeline which will transform a given image into another named image product.
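
One branch of such a task set can be sketched with Dask’s {name: (callable, *dependency_names)} dictionary convention. The product names and the toy scheduler below are made up for illustration; S1 Tiling hands the real graph to Dask.

```python
from functools import partial

def run_pipeline(name, *inputs):
    """Stand-in for executing an actual pipeline on its resolved inputs."""
    return f"{name}({', '.join(inputs)})"

tasks = {
    "s1a_vv.tiff":      "downloaded S1 image",   # leaf: file already on disk
    "s1a_vv_cut.tif":   (partial(run_pipeline, "PrepareForOrtho"), "s1a_vv.tiff"),
    "s1a_vv_ortho.tif": (partial(run_pipeline, "OrthoRectify"), "s1a_vv_cut.tif"),
}

def execute(tasks, key):
    """Tiny recursive scheduler standing in for Dask's."""
    value = tasks[key]
    if not isinstance(value, tuple):
        return value                              # leaf product
    func, *deps = value
    return func(*(execute(tasks, dep) for dep in deps))

print(execute(tasks, "s1a_vv_ortho.tif"))
# OrthoRectify(PrepareForOrtho(downloaded S1 image))
```

Each key names a product; resolving the final product pulls in every dependent task, which is exactly what happens when a product of one pipeline feeds the next.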

Processing Classes

Again, the processing classes are split into two families:

Step Factories

Inheritance diagram of s1tiling.libs.otbpipeline.OTBStepFactory, s1tiling.libs.otbpipeline.ExecutableStepFactory, s1tiling.libs.otbpipeline._FileProducingStepFactory, s1tiling.libs.otbpipeline.Store

StepFactory

class s1tiling.libs.otbpipeline.StepFactory(name, *unused_argv, **kwargs)[source]

Bases: abc.ABC

Abstract factory for Step

Meant to be inherited for each possible OTB application or external application used in a pipeline.

Sometimes we may also want to add artificial steps that analyse products, filenames…, or steps that help filter products for the following pipelines.

See: Existing processings

_update_filename_meta_pre_hook(meta)[source]

Hook meant to be overridden to complete product metadata before it is used to produce filenames or task names.

Called from update_filename_meta()

_update_filename_meta_post_hook(meta)[source]

Hook meant to be overridden to fix product metadata by overriding their default definition.

Called from update_filename_meta()
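
For illustration, a hypothetical subclass could plug into these two hooks as follows. The base class is stubbed here (only update_filename_meta() and the hooks) so the sketch stands alone; MyProcessing and its naming scheme are invented.

```python
class StepFactoryStub:
    def update_filename_meta(self, meta):
        meta = dict(meta)                        # duplicate: inputs stay untouched
        self._update_filename_meta_pre_hook(meta)
        meta["out_filename"] = f"{meta['basename']}_{meta.get('kind', 'raw')}.tif"
        self._update_filename_meta_post_hook(meta)
        return meta

    def _update_filename_meta_pre_hook(self, meta):
        pass

    def _update_filename_meta_post_hook(self, meta):
        pass

class MyProcessing(StepFactoryStub):
    def _update_filename_meta_pre_hook(self, meta):
        meta["kind"] = "calibrated"              # complete meta before naming

    def _update_filename_meta_post_hook(self, meta):
        meta["task_name"] = meta["out_filename"].upper()  # override a default

meta = MyProcessing().update_filename_meta({"basename": "s1a_vv"})
print(meta["out_filename"])  # s1a_vv_calibrated.tif
```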

build_step_output_filename(meta)[source]

Filename of the step output.

See also build_step_output_tmp_filename() regarding the actual processing.

build_step_output_tmp_filename(meta)[source]

Returns a filename to a temporary file to use in output of the current application.

When an OTB (or external) application is harshly interrupted (crash or user interruption), it leaves behind an incomplete, and thus invalid, file. In order to ignore those files when a pipeline is restarted, a temporary filename is used by the application. Once the application exits with success, the file is renamed to build_step_output_filename(), and possibly moved into _FileProducingStepFactory.output_directory() if this is a final product.
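
This pattern can be summarized with a minimal, self-contained sketch (the helper and filenames are illustrative, not the actual S1 Tiling code):

```python
import os
import tempfile

def produce(final_name, write):
    tmp_name = final_name + ".tmp"
    write(tmp_name)                   # a crash here leaves only the .tmp file
    os.replace(tmp_name, final_name)  # atomic rename: final file is valid

def write_pixels(path):
    with open(path, "w") as out:
        out.write("pixels")

with tempfile.TemporaryDirectory() as workdir:
    final = os.path.join(workdir, "ortho.tif")
    produce(final, write_pixels)
    print(os.path.exists(final), os.path.exists(final + ".tmp"))  # True False
```

A restarted pipeline only has to discard any leftover temporary name; every file bearing a final name is known to be complete.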

check_requirements()[source]

Abstract method used to test whether a StepFactory has all its external requirements fulfilled. For instance, OTBStepFactory’s will check their related OTB application can be executed.

Returns:None if requirements are fulfilled.
Returns:A message indicating what is missing otherwise, and some context how to fix it.
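
A hypothetical override following this contract could look like the following; the executable checked and the factory name are made up for the example.

```python
import shutil

class GdalVRTStepFactory:
    """Hypothetical factory depending on an external tool."""
    def check_requirements(self):
        if shutil.which("gdalbuildvrt") is not None:
            return None                               # requirement fulfilled
        return ("gdalbuildvrt not found in PATH", "install the GDAL binaries")

result = GdalVRTStepFactory().check_requirements()
```
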
complete_meta(meta, all_inputs)[source]

Duplicates, completes, and returns the meta dictionary with information specific to the current factory regarding Step instantiation.

create_step(in_memory: bool, previous_steps)[source]

Instantiates the step related to the current StepFactory, which consumes results from the previous input steps.

1. This method starts by updating metadata information through complete_meta() on the input metadata.

2. In case the new step isn’t related to an OTB application, nothing specific is done; an AbstractStep is simply returned.

Note: While previous_steps is ignored in this specialization, it’s used in Store.create_step() where it’s eventually used to release all OTB Application objects.

image_description

Property image_description, used to fill TIFFTAG_IMAGEDESCRIPTION

name

Step Name property.

update_filename_meta(meta)[source]

Duplicates, completes, and returns the meta dictionary with information specific to the current factory regarding task analysis.

This method is used:

  • while analysing the dependencies to build the task graph – in this use case the relevant information is the file names and paths;
  • and indirectly before instantiating a new Step.

Other metadata not filled here:

  • get_task_name() which is deduced from out_filename by default
  • out_extended_filename_complement()

It’s possible to inject some other metadata (that could be used from _get_canonical_input() for instance) thanks to _update_filename_meta_pre_hook().

update_image_metadata(meta, all_inputs)[source]

Root implementation of update_image_metadata() that shall be specialized in every file producing Step Factory.

_FileProducingStepFactory

class s1tiling.libs.otbpipeline._FileProducingStepFactory(cfg, gen_tmp_dir, gen_output_dir, gen_output_filename, *argv, **kwargs)[source]

Bases: s1tiling.libs.otbpipeline.StepFactory

Abstract class that factorizes filename transformations and parameter handling for Steps that produce files, either with OTB or through external calls.

create_step() is kind of abstract at this point.

__init__(cfg, gen_tmp_dir, gen_output_dir, gen_output_filename, *argv, **kwargs)[source]

Constructor

See output_directory(), tmp_directory(), build_step_output_filename() and build_step_output_tmp_filename() for the usage of gen_tmp_dir, gen_output_dir and gen_output_filename.

build_step_output_filename(meta)[source]

Returns the names of typical result files in case their production is required (i.e. not in-memory processing).

This specialization uses gen_output_filename naming policy parameter to build the output filename. See the Available naming policies.

build_step_output_tmp_filename(meta)[source]

This specialization of StepFactory.build_step_output_tmp_filename() will automatically insert .tmp before the filename extension.
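
A hypothetical helper following the described naming policy (not the actual implementation):

```python
import os

def tmp_name(filename):
    """Insert ".tmp" before the filename extension."""
    root, ext = os.path.splitext(filename)
    return f"{root}.tmp{ext}"

print(tmp_name("s1a_33NWB_vv.tif"))  # s1a_33NWB_vv.tmp.tif
```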

output_directory(meta)[source]

Accessor to where output files will be stored in case their production is required (i.e. not in-memory processing)

This property is built from gen_output_dir construction parameter. Typical values for the parameter are:

  • os.path.join(cfg.output_preprocess, '{tile_name}'), where tile_name is looked into meta parameter
  • None, in which case the result will be the same as tmp_directory(). This makes sense for steps that don’t produce required products
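
For illustration, such a pattern could be resolved against the meta dictionary with str.format(); the paths are made up and the actual resolution code may differ.

```python
import os

gen_output_dir = os.path.join("/data/out", "{tile_name}")  # construction parameter
meta = {"tile_name": "33NWB"}                              # looked up per product
output_directory = gen_output_dir.format(**meta)
print(output_directory)  # /data/out/33NWB
```
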
parameters(meta)[source]

Most steps that produce files will expect parameters.

Warning: In ExecutableStepFactory, parameters that designate output filenames are expected to use tmp_filename() and not out_filename(). Indeed products are meant to be first produced with temporary names before being renamed with their final names, once the operation producing them has succeeded.

Note: This method is kind-of abstract – SelectBestCoverage is a _FileProducingStepFactory but it doesn’t actually consume parameters.

ram_per_process

Property ram_per_process

tmp_directory(meta)[source]

Directory used to store temporary files before they are renamed into their final version.

This property is built from gen_tmp_dir construction parameter. Typical values for the parameter are:

  • os.path.join(cfg.tmpdir, 'S1')
  • os.path.join(cfg.tmpdir, 'S2', '{tile_name}') where tile_name is looked into meta parameter

OTBStepFactory

class s1tiling.libs.otbpipeline.OTBStepFactory(cfg, appname, gen_tmp_dir, gen_output_dir, gen_output_filename, *argv, **kwargs)[source]

Bases: s1tiling.libs.otbpipeline._FileProducingStepFactory

Abstract StepFactory for all OTB Applications.

All step factories that wrap OTB applications are meant to inherit from OTBStepFactory.

__init__(cfg, appname, gen_tmp_dir, gen_output_dir, gen_output_filename, *argv, **kwargs)[source]

Constructor.

See:
_FileProducingStepFactory.__init__()
Parameters:
param_in:Flag used by the default OTB application for the input file (default: “in”)
param_out:Flag used by the default OTB application for the output file (default: “out”)
appname

OTB Application property.

check_requirements()[source]

This specialization of check_requirements() checks whether the related OTB application can correctly be executed from S1Tiling.

Returns:A pair of the message indicating what is required, and some context how to fix it – by default: install OTB!
Returns:None otherwise.
create_step(in_memory: bool, previous_steps)[source]

Instantiates the step related to the current StepFactory, which consumes results from the previous input step.

1. This method starts by updating metadata information through complete_meta() on the input metadata.

2. Then, steps that wrap an OTB application will instantiate this application object, and:

  • either pipe the new application to the one from the input step if it wasn’t a first step
  • or fill in the “in” parameter of the application with the out_filename() of the input step.

2-bis. In case the new step isn’t related to an OTB application, nothing specific is done; an AbstractStep is simply returned.

Note: While previous_steps is ignored in this specialization, it’s used in Store.create_step() where it’s eventually used to release all OTB Application objects.

Note: it’s possible to override this method to return no step (None). In that case, no OTB Application would be registered in the actual Pipeline.
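
The two connection modes of step 2 can be sketched with stubs standing in for OTB application bindings. In the real code, in-memory piping typically goes through the OTB Python bindings’ SetParameterInputImage()/GetParameterOutputImage(); everything else below is invented for the example.

```python
class StubApp:
    """Stand-in for an OTB application binding."""
    def __init__(self):
        self.inputs = {}

class StepStub:
    """Stand-in for the previous step handed to create_step()."""
    def __init__(self, app=None, out_filename="input.tif"):
        self.app = app
        self.out_filename = out_filename

    @property
    def is_first_step(self):
        return self.app is None

def connect(app, param_in, previous):
    if previous.is_first_step:
        app.inputs[param_in] = previous.out_filename  # read file from disk
    else:
        app.inputs[param_in] = previous.app           # pipe in memory

calib, ortho = StubApp(), StubApp()
connect(calib, "in", StepStub())              # first step: fed a filename
connect(ortho, "io.in", StepStub(app=calib))  # later step: in-memory pipe
print(calib.inputs["in"], ortho.inputs["io.in"] is calib)  # input.tif True
```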

param_in

Name of the “in” parameter used by the OTB Application. Default is likely to be “in”, while some applications use “io.in”, often “il” for list of files…

param_out

Name of the “out” parameter used by the OTB Application. Default is likely to be “out”, while some applications use “io.out”.

requirement_context()[source]

Return the requirement context that permits fixing missing requirements. By default, OTB applications require… OTB!

set_output_pixel_type(app, meta)[source]

Permits steps to force the output pixel type. Does nothing by default. Override this method to change the output pixel type.

ExecutableStepFactory

class s1tiling.libs.otbpipeline.ExecutableStepFactory(cfg, exename, gen_tmp_dir, gen_output_dir, gen_output_filename, *argv, **kwargs)[source]

Bases: s1tiling.libs.otbpipeline._FileProducingStepFactory

Abstract StepFactory for executing any external program.

All step factories that wrap external executables are meant to inherit from ExecutableStepFactory.

create_step(in_memory: bool, previous_steps)[source]

This Step creation method does more than just create the step: it also immediately executes the external process.

Store

class s1tiling.libs.otbpipeline.Store(appname, *argv, **kwargs)[source]

Bases: s1tiling.libs.otbpipeline.StepFactory

Factory for Artificial Step that forces the result of the previous app sequence to be stored on disk by breaking in-memory connection.

While it could be used manually, it’s meant to be automatically appended at the end of a pipeline if any step is actually related to OTB.

build_step_output_filename(meta)[source]

Filename of the step output.

See also build_step_output_tmp_filename() regarding the actual processing.

build_step_output_tmp_filename(meta)[source]

Returns a filename to a temporary file to use in output of the current application.

When an OTB (or external) application is harshly interrupted (crash or user interruption), it leaves behind an incomplete, and thus invalid, file. In order to ignore those files when a pipeline is restarted, a temporary filename is used by the application. Once the application exits with success, the file is renamed to build_step_output_filename(), and possibly moved into _FileProducingStepFactory.output_directory() if this is a final product.

create_step(in_memory: bool, previous_steps)[source]

Specializes StepFactory.create_step() to trigger StoreStep.execute_and_write_output() on the last step that relates to an OTB Application.

Steps

Inheritance diagram of s1tiling.libs.otbpipeline.Step, s1tiling.libs.otbpipeline.FirstStep, s1tiling.libs.otbpipeline.ExecutableStep, s1tiling.libs.otbpipeline.MergeStep, s1tiling.libs.otbpipeline.StoreStep, s1tiling.libs.otbpipeline._StepWithOTBApplication

Step types are usually instantiated automatically.

  • FirstStep is instantiated automatically by the program from existing files (downloaded, or produced by a pipeline earlier in the sequence of pipelines)
  • MergeStep is also instantiated automatically as an alternative to FirstStep for steps that expect several input files of the same type. This is for instance the case of Concatenate inputs. A step is recognized to await several inputs when the dependency-analysis phase finds several possible inputs that lead to a product.
  • Step is the main class for steps that execute an OTB application.
  • ExecutableStep is the main class for steps that execute an external application.
  • AbstractStep is the root class of the step hierarchy. It still gets instantiated automatically for steps not related to any kind of application.

AbstractStep

class s1tiling.libs.otbpipeline.AbstractStep(*unused_argv, **kwargs)[source]

Bases: object

Internal root class for all actual Steps.

There are several kinds of steps:

  • FirstStep that contains information about input files
  • Step that registers an otbapplication binding
  • StoreStep that momentarily disconnects the in-memory pipeline to force storing of the resulting file.
  • ExecutableStep that executes external applications
  • MergeStep that operates a rendezvous between several steps producing files of the same kind.

The step will contain information like the current input file, the current output file…

basename

Basename property will be used to generate all future output filenames.

clean_cache()[source]

Takes care of removing intermediary files once we know they are no longer required, like the orthorectified subtiles once the concatenation has been done.

is_first_step

Tells whether this step is the first of a pipeline.

meta

Step meta data property.

out_filename

Property that returns the name of the file produced by the current step.

release_app()[source]

Makes sure that steps with applications are releasing the application

shall_store

By default, no OTB-related step requires its result to be stored on disk and the in-memory connection to be broken.

However, the artificial Step produced by Store factory will force the result of the previous application(s) to be stored on disk.

FirstStep

class s1tiling.libs.otbpipeline.FirstStep(*argv, **kwargs)[source]

Bases: s1tiling.libs.otbpipeline.AbstractStep

First Step:

  • no application executed
input_metas

Specific to MergeStep and FirstStep: returns the metas from the inputs as a list.

_StepWithOTBApplication

class s1tiling.libs.otbpipeline._StepWithOTBApplication(app, *argv, **kwargs)[source]

Bases: s1tiling.libs.otbpipeline.AbstractStep

Internal intermediary type for Steps that have an application object. Not meant to be used directly.

Parent type for:

  • Step that will own the application
  • and StoreStep that will just reference the application from the previous step
app

OTB Application property.

is_first_step

Tells whether this step is the first of a pipeline.

param_out

Name of the “out” parameter used by the OTB Application. Default is likely to be “out”, while some applications use “io.out”.

release_app()[source]

Makes sure that steps with applications are releasing the application

Step

class s1tiling.libs.otbpipeline.Step(app, *argv, **kwargs)[source]

Bases: s1tiling.libs.otbpipeline._StepWithOTBApplication

Internal specialized Step that holds a binding to an OTB Application.

The application binding is expected to be built by a dedicated StepFactory and passed to the constructor.

release_app()[source]

Makes sure that steps with applications are releasing the application

MergeStep

class s1tiling.libs.otbpipeline.MergeStep(input_steps_metas, *argv, **kwargs)[source]

Bases: s1tiling.libs.otbpipeline.AbstractStep

Kind of FirstStep that merges the results of one or several other steps of the same kind.

Used in input of Concatenate

  • no application executed
input_metas

Specific to MergeStep and FirstStep: returns the metas from the inputs as a list.

ExecutableStep

class s1tiling.libs.otbpipeline.ExecutableStep(exename, *argv, **kwargs)[source]

Bases: s1tiling.libs.otbpipeline.AbstractStep

Generic step for calling any external application.

execute_and_write_output(parameters)[source]

Actually execute the external program. While the program runs, a temporary filename will be used as output. On successful execution, the output will be renamed to match its expected final name.
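
An illustrative equivalent of this behaviour, with touch standing in for the real executable and all names made up:

```python
import os
import subprocess
import tempfile

def execute_and_rename(cmd, tmp_filename, out_filename):
    subprocess.run(cmd + [tmp_filename], check=True)  # raises on failure
    os.replace(tmp_filename, out_filename)            # reached only on success

with tempfile.TemporaryDirectory() as workdir:
    tmp = os.path.join(workdir, "dem.tmp.vrt")
    out = os.path.join(workdir, "dem.vrt")
    execute_and_rename(["touch"], tmp, out)
    print(os.path.exists(out), os.path.exists(tmp))  # True False
```

Because the rename is only reached when the external program exits successfully, an interrupted run never leaves a file under the final name.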

tmp_filename

Property that returns the name of the file produced by the current step while the external application is running. Eventually, it’ll get renamed into AbstractStep.out_filename() if the application succeeds.

Existing processings

The domain processings are defined through StepFactory subclasses, which in turn will instantiate domain unaware subclasses of AbstractStep for the actual processing.

Main processings

ExtractSentinel1Metadata
class s1tiling.libs.otbwrappers.ExtractSentinel1Metadata(cfg)[source]

Bases: s1tiling.libs.otbpipeline.StepFactory

Factory that takes care of extracting meta data from S1 input files.

Note: At this moment it needs to be used in a separate pipeline to make sure the meta is updated when calling PipelineDescription.expected().

build_step_output_filename(meta)[source]

Forward the output filename.

build_step_output_tmp_filename(meta)[source]

As there is no OTB application associated to ExtractSentinel1Metadata, there is no temporary filename.

complete_meta(meta, all_inputs)[source]

Complete meta information with inputs

update_image_metadata(meta, all_inputs)[source]

Set the common and root information that’ll get carried around.

AnalyseBorders
class s1tiling.libs.otbwrappers.AnalyseBorders(cfg)[source]

Bases: s1tiling.libs.otbpipeline.StepFactory

StepFactory that analyses whether image borders need to be cut as described in Margins cutting documentation.

The step produced by this actual factory doesn’t register any OTB application nor execute one. However, it loads two lines from the input image to determine whether it contains too many NoData pixels.

Found information will be stored into the meta dictionary for later use by CutBorders step factory.

build_step_output_filename(meta)[source]

Forward the output filename.

build_step_output_tmp_filename(meta)[source]

As there is no OTB application associated to AnalyseBorders, there is no temporary filename.

complete_meta(meta, all_inputs)[source]

Complete meta information with Cutting thresholds.

Calibrate
class s1tiling.libs.otbwrappers.Calibrate(cfg)[source]

Bases: s1tiling.libs.otbpipeline.OTBStepFactory

Factory that prepares steps that run SARCalibration as described in SAR Calibration documentation.

Requires the following information from the configuration object:

  • ram_per_process
  • calibration_type
  • removethermalnoise

Requires the following information from the metadata dictionary:

  • base name – to generate typical output filename
  • input filename
  • output filename
parameters(meta)[source]

Returns the parameters to use with SARCalibration OTB application.

update_image_metadata(meta, all_inputs)[source]

Set calibration related information that’ll get carried around.

CutBorders
class s1tiling.libs.otbwrappers.CutBorders(cfg)[source]

Bases: s1tiling.libs.otbpipeline.OTBStepFactory

Factory that prepares steps that run ResetMargin as described in Margins cutting documentation.

Requires the following information from the configuration object:

  • ram_per_process

Requires the following information from the metadata dictionary

  • base name – to generate typical output filename
  • input filename
  • output filename
  • cut->`threshold.x` – from AnalyseBorders
  • cut->`threshold.y.start` – from AnalyseBorders
  • cut->`threshold.y.end` – from AnalyseBorders
parameters(meta)[source]

Returns the parameters to use with ResetMargin OTB application.

OrthoRectify
class s1tiling.libs.otbwrappers.OrthoRectify(cfg)[source]

Bases: s1tiling.libs.otbwrappers._OrthoRectifierFactory

Factory that prepares steps that run OrthoRectification as described in Orthorectification documentation.

Requires the following information from the configuration object:

  • ram_per_process
  • out_spatial_res
  • GeoidFile
  • grid_spacing
  • tmp_srtm_dir

Requires the following information from the metadata dictionary

  • base name – to generate typical output filename
  • input filename
  • output filename
  • manifest
  • tile_name
  • tile_origin
Concatenate
class s1tiling.libs.otbwrappers.Concatenate(cfg)[source]

Bases: s1tiling.libs.otbwrappers._ConcatenatorFactory

Abstract factory that prepares steps that run Synthetize as described in Concatenation documentation.

Requires the following information from the configuration object:

  • ram_per_process

Requires the following information from the metadata dictionary

  • input filename
  • output filename
update_out_filename(meta, with_task_info)[source]

This hook is triggered every time a new compatible input is added. The effect is quite unique to Concatenate, as the name of the output product depends on the number of inputs and their common acquisition date.

BuildBorderMask
class s1tiling.libs.otbwrappers.BuildBorderMask(cfg)[source]

Bases: s1tiling.libs.otbpipeline.OTBStepFactory

Factory that prepares the first step that generates border masks as described in Border mask generation documentation.

Requires the following information from the configuration object:

  • ram_per_process

Requires the following information from the metadata dictionary

  • input filename
  • output filename
parameters(meta)[source]

Returns the parameters to use with BandMath OTB application for computing border mask.

set_output_pixel_type(app, meta)[source]

Force the output pixel type to UINT8.

SmoothBorderMask
class s1tiling.libs.otbwrappers.SmoothBorderMask(cfg)[source]

Bases: s1tiling.libs.otbpipeline.OTBStepFactory

Factory that prepares the step that smooths border masks as described in Border mask generation documentation.

Requires the following information from the configuration object:

  • ram_per_process

Requires the following information from the metadata dictionary

  • input filename
  • output filename
parameters(meta)[source]

Returns the parameters to use with BinaryMorphologicalOperation OTB application to smooth border masks.

set_output_pixel_type(app, meta)[source]

Force the output pixel type to UINT8.

SpatialDespeckle
class s1tiling.libs.otbwrappers.SpatialDespeckle(cfg)[source]

Bases: s1tiling.libs.otbpipeline.OTBStepFactory

Factory that prepares the step that performs spatial despeckling as described in Spatial despeckle filtering documentation.

Requires the following information from the configuration object:

  • ram_per_process
  • fname_fmt_filtered
  • filter: the name of the filter method
  • rad: the filter window radius
  • nblooks: the number of looks
  • deramp: the deramp factor

Requires the following information from the metadata dictionary

  • input filename
  • output filename
  • the keys used to generate the filename: flying_unit_code, tile_name, orbit_direction, orbit, calibration_type, acquisition_stamp, polarisation
complete_meta(meta, all_inputs)[source]

Complete meta information with inputs, and set compression method to DEFLATE.

parameters(meta)[source]

Returns the parameters to use with Despeckle OTB application to perform speckle noise reduction.

update_image_metadata(meta, all_inputs)[source]

Set despeckling related information that’ll get carried around.

Processings for advanced calibration

These processings permit producing Local Incidence Angle maps for σ0NORMLIM calibration.

AgglomerateDEM
class s1tiling.libs.otbwrappers.AgglomerateDEM(cfg, *args, **kwargs)[source]

Bases: s1tiling.libs.otbpipeline.ExecutableStepFactory

Factory that produces a Step that builds a VRT from a list of DEM files.

The choice has been made to name the VRT file after the basename of the root S1 product and not the names of the DEM tiles.

complete_meta(meta, all_inputs)[source]

Complete meta information with the list of DEM files to agglomerate.

parameters(meta)[source]

Most steps that produce files will expect parameters.

Warning: In ExecutableStepFactory, parameters that designate output filenames are expected to use tmp_filename() and not out_filename(). Indeed products are meant to be first produced with temporary names before being renamed with their final names, once the operation producing them has succeeded.

Note: This method is kind-of abstract – SelectBestCoverage is a _FileProducingStepFactory but it doesn’t actually consume parameters.

SARDEMProjection
class s1tiling.libs.otbwrappers.SARDEMProjection(cfg)[source]

Bases: s1tiling.libs.otbpipeline.OTBStepFactory

Factory that prepares steps that run SARDEMProjection as described in Normals computation documentation.

SARDEMProjection application puts a DEM file into SAR geometry and estimates two additional coordinates. For each point of the DEM input, four components are calculated: C (column into SAR image), L (line into SAR image), Z and Y. XYZ cartesian components into projection are also computed for our needs.

Requires the following information from the configuration object:

  • ram_per_process

Requires the following information from the metadata dictionary

  • input filename
  • output filename

It also requires $OTB_GEOID_FILE to be set in order to ignore any DEM information already registered in the Dask worker (through OrthoRectification for instance) and to only use the Geoid.

add_image_metadata(meta, app)[source]

Post-application hook used to complete GDAL metadata.

As update_image_metadata() is not designed to access OTB application information (directiontoscandeml…), we need this extra hook to fetch and propagate the PRJ information.

complete_meta(meta, all_inputs)[source]

Complete meta information with hook for updating image metadata w/ directiontoscandemc, directiontoscandeml and gain.

parameters(meta)[source]

Returns the parameters to use with SARDEMProjection OTB application to project S1 geometry onto DEM tiles.

requirement_context()[source]

Return the requirement context that permits fixing missing requirements. SARDEMProjection comes from DiapOTB.

update_image_metadata(meta, all_inputs)[source]

Set SARDEMProjection related information that’ll get carried around.

SARCartesianMeanEstimation
class s1tiling.libs.otbwrappers.SARCartesianMeanEstimation(cfg)[source]

Bases: s1tiling.libs.otbpipeline.OTBStepFactory

Factory that prepares steps that run SARCartesianMeanEstimation as described in Normals computation documentation.

SARCartesianMeanEstimation estimates a simulated cartesian mean image thanks to a DEM file.

Requires the following information from the configuration object:

  • ram_per_process

Requires the following information from the metadata dictionary

  • input filename
  • output filename

Note: It cannot be chained in memory because of the directiontoscandem* parameters.

complete_meta(meta, all_inputs)[source]

Complete meta information with hook for updating image metadata w/ directiontoscandemc, directiontoscandeml and gain.

fetch_direction(inputpath, meta)[source]

Extract back direction to scan DEM from SARDEMProjected image metadata.

parameters(meta)[source]

Returns the parameters to use with SARCartesianMeanEstimation OTB application to compute cartesian coordinates of each point of the origin S1 image.

requirement_context()[source]

Return the requirement context that permits fixing missing requirements. SARCartesianMeanEstimation comes from DiapOTB.

update_image_metadata(meta, all_inputs)[source]

Set SARCartesianMeanEstimation related information that’ll get carried around.

ComputeNormals
class s1tiling.libs.otbwrappers.ComputeNormals(cfg)[source]

Bases: s1tiling.libs.otbpipeline.OTBStepFactory

Factory that prepares steps that run ExtractNormalVector as described in Normals computation documentation.

ExtractNormalVector computes surface normals.

Requires the following information from the configuration object:

  • ram_per_process

Requires the following information from the metadata dictionary

  • input filename
  • output filename
complete_meta(meta, all_inputs)[source]

Override complete_meta() to inject files to remove

parameters(meta)[source]

Returns the parameters to use with ExtractNormalVector OTB application to generate surface normals for each point of the origin S1 image.

requirement_context()[source]

Return the requirement context that permits fixing missing requirements. ComputeNormals comes from normlim_sigma0.

ComputeLIA
class s1tiling.libs.otbwrappers.ComputeLIA(cfg)[source]

Bases: s1tiling.libs.otbpipeline.OTBStepFactory

Factory that prepares steps that run SARComputeLocalIncidenceAngle as described in Normals computation documentation.

SARComputeLocalIncidenceAngle computes the Local Incidence Angle Map.

Requires the following information from the configuration object:

  • ram_per_process

Requires the following information from the metadata dictionary

  • input filename
  • output filename

complete_meta(meta, all_inputs)[source]

Complete meta information with inputs

parameters(meta)[source]

Returns the parameters to use with SARComputeLocalIncidenceAngle OTB application.

requirement_context()[source]

Return the requirement context that helps resolve missing requirements. ComputeLIA comes from normlim_sigma0.

filter_LIA()
s1tiling.libs.otbwrappers.filter_LIA(LIA_kind)[source]

Generates a new StepFactory class that filters which LIA product shall be processed: LIA maps or sin LIA maps.

class s1tiling.libs.otbwrappers._FilterStepFactory(name, *unused_argv, **kwargs)[source]

Bases: s1tiling.libs.otbpipeline.StepFactory

Helper root class for all LIA/sin filtering steps.

This class will be specialized on the fly by filter_LIA() which will inject the static data _LIA_kind.
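The on-the-fly specialization can be pictured with a small standalone sketch. Only the names _FilterStepFactory, filter_LIA and _LIA_kind come from the documentation above; the class body and the keep() helper are invented here for illustration:

```python
# Minimal standalone sketch of the filter_LIA() mechanism: a new class is
# created on the fly with type(), injecting the static attribute _LIA_kind.
# This is NOT the real s1tiling implementation, only an illustration of the
# dynamic-specialization idea.

class _FilterStepFactory:
    """Helper root class for all LIA/sin filtering steps (sketch)."""
    _LIA_kind = None  # injected by filter_LIA() below

    def keep(self, product_kind):
        # Keep only the products matching the injected kind.
        return product_kind == self._LIA_kind


def filter_LIA(LIA_kind):
    # Specialize _FilterStepFactory on the fly by injecting _LIA_kind.
    return type(f'Filter_{LIA_kind}', (_FilterStepFactory,),
                {'_LIA_kind': LIA_kind})


lia_only = filter_LIA('LIA')()
assert lia_only.keep('LIA') and not lia_only.keep('sin_LIA')
```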

build_step_output_filename(meta)[source]

Forward the output filename.

build_step_output_tmp_filename(meta)[source]

As there is no OTB application associated with ExtractSentinel1Metadata, there is no temporary filename.

OrthoRectifyLIA
class s1tiling.libs.otbwrappers.OrthoRectifyLIA(cfg)[source]

Bases: s1tiling.libs.otbwrappers._OrthoRectifierFactory

Factory that prepares steps that run OrthoRectification on LIA maps.

Requires the following information from the configuration object:

  • ram_per_process
  • out_spatial_res
  • GeoidFile
  • grid_spacing
  • tmp_srtm_dir

Requires the following information from the metadata dictionary

  • base name – to generate typical output filename
  • input filename
  • output filename
  • manifest
  • tile_name
  • tile_origin

set_output_pixel_type(app, meta)[source]

Force LIA output pixel type to INT8.

update_image_metadata(meta, all_inputs)[source]

Set LIA kind related information that’ll get carried around.

ConcatenateLIA
class s1tiling.libs.otbwrappers.ConcatenateLIA(cfg)[source]

Bases: s1tiling.libs.otbwrappers._ConcatenatorFactory

Factory that prepares steps that run Synthetize on LIA images.

Requires the following information from the configuration object:

  • ram_per_process

Requires the following information from the metadata dictionary

  • input filename
  • output filename

set_output_pixel_type(app, meta)[source]

Force LIA output pixel type to INT8.

update_image_metadata(meta, all_inputs)[source]

Update concatenated LIA related information that’ll get carried around.

update_out_filename(meta, with_task_info)[source]

Unlike the usual Concatenate, the output filename will always end in “txxxxxx”.

However, we want to update the coverage of the current pair as a new input file has been registered.

TODO: Find a better name for the hook as it handles two different services.

SelectBestCoverage
class s1tiling.libs.otbwrappers.SelectBestCoverage(cfg)[source]

Bases: s1tiling.libs.otbpipeline._FileProducingStepFactory

StepFactory that helps select only one path after LIA concatenation: the one that has the best coverage of the target S2 tile.

If several concatenated products have the same coverage, the oldest one will be selected.

The coverage is extracted from tile_coverage step metadata.

The step produced does nothing: it only renames the selected product to the final expected name. Note: in the LIA case, two files will actually be renamed.
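The selection rule described above (best coverage wins; on ties, the oldest product wins) can be sketched as follows. The data structure and the helper name are hypothetical; only the rule itself comes from the documentation:

```python
# Illustrative sketch of the SelectBestCoverage selection rule: pick the
# concatenated product with the best S2-tile coverage; in case of a tie,
# keep the oldest one.

def select_best_coverage(products):
    """products: list of dicts with 'tile_coverage' (float, %) and
    'acquisition_date' (sortable, e.g. 'YYYYMMDD')."""
    # max() returns the first element among equal keys, so sorting by date
    # first makes the oldest product win in case of a coverage tie.
    ordered = sorted(products, key=lambda p: p['acquisition_date'])
    return max(ordered, key=lambda p: p['tile_coverage'])


candidates = [
    {'acquisition_date': '20200110', 'tile_coverage': 97.5},
    {'acquisition_date': '20200105', 'tile_coverage': 97.5},  # tie, but older
    {'acquisition_date': '20200122', 'tile_coverage': 42.0},
]
assert select_best_coverage(candidates)['acquisition_date'] == '20200105'
```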

create_step(in_memory: bool, previous_steps)[source]

Instantiates the step related to the current StepFactory, which consumes results from the previous input steps.

1. This method starts by updating metadata information through complete_meta() on the input metadata.

2. In case the new step isn’t related to an OTB application, nothing specific is done: an AbstractStep is simply returned.

Note: While previous_steps is ignored in this specialization, it’s used in Store.create_step() where it’s eventually used to release all OTB Application objects.

ApplyLIACalibration
class s1tiling.libs.otbwrappers.ApplyLIACalibration(cfg)[source]

Bases: s1tiling.libs.otbpipeline.OTBStepFactory

Factory that concludes σ0 with NORMLIM calibration.

It builds steps that multiply images calibrated with β0 LUT, and orthorectified to S2 grid, with the sin(LIA) map for the same S2 tile (and orbit number and direction).

Requires the following information from the configuration object:

  • ram_per_process

Requires the following information from the metadata dictionary

  • input filename
  • output filename
  • flying_unit_code
  • tile_name
  • polarisation
  • orbit_direction
  • orbit
  • acquisition_stamp

complete_meta(meta, all_inputs)[source]

Complete meta information with inputs, and set compression method to DEFLATE.

parameters(meta)[source]

Returns the parameters to use with the BandMath OTB application for applying sin(LIA) to the β0-calibrated image orthorectified to the S2 tile.
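The kind of parameter dictionary such a parameters() method could return may look like the sketch below. The keys "il", "exp", "ram" and "out" are standard OTB BandMath parameters; the meta keys, filenames and the helper name are invented here, and the exact expression used by s1tiling may differ:

```python
# Hypothetical sketch of a BandMath parameter dictionary for multiplying a
# β0-calibrated image by the sin(LIA) map (illustration only).

def apply_lia_parameters(meta):
    return {
        'ram': str(meta['ram_per_process']),
        # im1: β0-calibrated image orthorectified to S2; im2: sin(LIA) map.
        'il': [meta['calibrated_filename'], meta['sin_lia_filename']],
        'exp': 'im1b1*im2b1',   # multiply band 1 of both inputs
        'out': meta['output_filename'],
    }


params = apply_lia_parameters({
    'ram_per_process': 4096,
    'calibrated_filename': 'beta0_ortho.tif',      # placeholder name
    'sin_lia_filename': 'sin_LIA.tif',             # placeholder name
    'output_filename': 'sigma0_normlim.tif',       # placeholder name
})
assert params['exp'] == 'im1b1*im2b1'
```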

update_image_metadata(meta, all_inputs)[source]

Set σ° normlim calibration related information that’ll get carried around.

Filename generation

At each step, product filenames are automatically generated by the StepFactory.update_filename_meta function. This function is first used to generate the task execution graph. (It is still called a second time, during actual execution, but this should change eventually.)

The exact filename generation is handled by the StepFactory.build_step_output_filename and StepFactory.build_step_output_tmp_filename functions, which define the final filename and the working filename (used while the associated product is being computed).

In some very specific cases, where no product is generated, these functions need to be overridden. Otherwise, a default behaviour is provided by the _FileProducingStepFactory constructor, controlled through the following parameters:

  • gen_tmp_dir: defines where temporary files are produced.
  • gen_output_dir: defines where final files are produced. When this parameter is left unspecified, the final product is considered to be an intermediary file and it will be stored in the temporary directory. The distinction is useful for final and required products.
  • gen_output_filename: defines the naming policy for both temporary and final filenames.
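How these three parameters interact can be illustrated with a simplified, standalone stand-in for _FileProducingStepFactory (the real class lives in s1tiling.libs.otbpipeline; the class body below is invented for the example):

```python
import os

# Standalone sketch of the default filename behaviour described above:
# final products go to gen_output_dir, intermediary products (no output
# dir) stay in gen_tmp_dir, and gen_output_filename names both.

class FileProducingSketch:
    def __init__(self, gen_tmp_dir, gen_output_filename, gen_output_dir=None):
        self._tmp_dir = gen_tmp_dir
        self._out_dir = gen_output_dir   # None => intermediary product
        self._namer = gen_output_filename

    def build_step_output_filename(self, meta):
        # Final products go to the output dir; intermediary ones stay in tmp.
        directory = self._out_dir if self._out_dir else self._tmp_dir
        return os.path.join(directory, self._namer(meta))

    def build_step_output_tmp_filename(self, meta):
        # Working name used while the product is being computed.
        return os.path.join(self._tmp_dir, self._namer(meta) + '.tmp')


factory = FileProducingSketch('tmp/S2', lambda m: m['basename'], 'out/S2')
meta = {'basename': '33NWB_vv.tif'}
assert factory.build_step_output_filename(meta) == os.path.join('out/S2', '33NWB_vv.tif')
```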

Important

As the filenames are used to define the task execution graph, it’s important that every possible product (and associated production task) can be uniquely identified without any risk of ambiguity. Failure to comply will destabilise the data flows.

If for some reason you need to define a complex data flow where an output can be used several times as input in different Steps, or where a Step has several inputs of same or different kinds, or where several products are concurrent and only one would be selected, please check all StepFactories related to LIA dataflow.

Available naming policies

Inheritance diagram of s1tiling.libs.otbpipeline.ReplaceOutputFilenameGenerator, s1tiling.libs.otbpipeline.TemplateOutputFilenameGenerator, s1tiling.libs.otbpipeline.OutputFilenameGeneratorList

Three filename generators are available by default. They apply a transformation on the basename meta information.

ReplaceOutputFilenameGenerator
class s1tiling.libs.otbpipeline.ReplaceOutputFilenameGenerator(before_afters)[source]

Given a pair [text_to_search, text_to_replace_with], replace the exact matching text with new text in basename metadata.
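The behaviour can be sketched with a simplified stand-in (the class body and the generate() method below are invented for illustration; only the [text_to_search, text_to_replace_with] contract comes from the documentation):

```python
# Behaviour sketch of ReplaceOutputFilenameGenerator: an exact text
# substitution applied to the basename meta information.

class ReplaceSketch:
    def __init__(self, before_afters):
        self._before, self._after = before_afters

    def generate(self, basename):
        return basename.replace(self._before, self._after)


gen = ReplaceSketch(['.raw', '_Calibrated.tif'])
assert gen.generate('product.raw') == 'product_Calibrated.tif'
```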

TemplateOutputFilenameGenerator
class s1tiling.libs.otbpipeline.TemplateOutputFilenameGenerator(template)[source]

Given a template: "text{key1}_{another_key}_..", inject the metadata instead of the template keys.

Most filename format templates can be fine-tuned to the end-user’s ideal filenames. While the filenames used for intermediary products may also be changed, doing so is not recommended for data-flow stability. See [Processing].fname_fmt.* for the short list of filenames meant to be adapted.
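The template mechanism works like Python format strings: metadata values are injected into the "{key}" placeholders. A simplified stand-in (class body invented for illustration):

```python
# Behaviour sketch of TemplateOutputFilenameGenerator: inject metadata
# values into "{key}" placeholders, as with str.format().

class TemplateSketch:
    def __init__(self, template):
        self._template = template

    def generate(self, meta):
        return self._template.format(**meta)


gen = TemplateSketch('{flying_unit_code}_{tile_name}_{polarisation}.tif')
meta = {'flying_unit_code': 's1a', 'tile_name': '33NWB', 'polarisation': 'vv'}
assert gen.generate(meta) == 's1a_33NWB_vv.tif'
```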

OutputFilenameGeneratorList
class s1tiling.libs.otbpipeline.OutputFilenameGeneratorList(generators)[source]

Some steps produce several products. This specialization makes it possible to generate several output filenames.

It’s constructed from other filename generators.
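Composition can be sketched as follows: the list generator delegates to the generators it wraps and yields one filename per generator (class body invented for illustration):

```python
# Behaviour sketch of OutputFilenameGeneratorList: built from other
# generators, it produces one output filename per wrapped generator.

class ListSketch:
    def __init__(self, generators):
        self._generators = generators

    def generate(self, basename):
        return [g(basename) for g in self._generators]


gen = ListSketch([
    lambda b: b.replace('.raw', '_LIA.tif'),
    lambda b: b.replace('.raw', '_sin_LIA.tif'),
])
assert gen.generate('product.raw') == ['product_LIA.tif', 'product_sin_LIA.tif']
```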

Hooks

StepFactory._update_filename_meta_pre_hook

Sometimes it’s necessary to analyse the input files, and/or their names, before the output filename(s) can be built. This is meant to be done by overriding the StepFactory._update_filename_meta_pre_hook method. Only lightweight analysis should be done here; its result can then be stored into the meta dictionary, and returned.

It’s typically used alongside TemplateOutputFilenameGenerator.
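An override of this pre-hook could look like the following sketch: a lightweight analysis of the input name whose result is stored into meta, where a template-based generator can later pick it up. The class name, the filename layout and the 'orbit' key are invented for the example:

```python
# Illustrative sketch of a _update_filename_meta_pre_hook override:
# extract a value from the input filename and store it into meta.

class MyStepFactorySketch:
    def _update_filename_meta_pre_hook(self, meta):
        # Hypothetical layout: "<unit>_<tile>_<direction>_<orbit>_<date>.tif"
        parts = meta['input filename'].split('_')
        meta['orbit'] = parts[3]
        return meta


meta = {'input filename': 's1a_33NWB_DES_007_20200108.tif'}
meta = MyStepFactorySketch()._update_filename_meta_pre_hook(meta)
assert meta['orbit'] == '007'
```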

StepFactory._update_filename_meta_post_hook

StepFactory.update_filename_meta provides various values to metadata. This hook permits overriding the values associated with task names, product existence tests, and so on.