Hook Service

A hook is a workflow you can trigger on demand via an API. DEDL lets you use predefined hooks or create your own.

The Hook Service is a serverless execution environment for running container-based workflows on demand in the Destination Earth Data Lake. It lets you trigger ready-to-use functions and workflows through an API, without having to manage the underlying Kubernetes infrastructure.

You can either use the provided hooks, which are designed for efficient data access and manipulation, or build and run your own custom hooks to support project-specific data processing and analysis needs.

How it works

The Hook Service is based on Argo Workflows, which runs on top of Kubernetes.

Both provided hooks and custom hooks run as containers and are triggered through a dedicated API provided by the platform.
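
As an illustration, the basic interaction pattern looks roughly like the sketch below: authenticate, then call the API over HTTPS. The base URL here is a placeholder, not the authoritative endpoint; the OnDemand Data Processing API documentation (prerequisite No. 3 below) has the real one.

```python
import requests

# Placeholder base URL; take the real endpoint from the OnDemand Data
# Processing API documentation (see prerequisite No. 3 below).
ODP_BASE_URL = "https://odp.example.destination-earth.eu"

# Bearer token obtained from the DestinE identity provider.
headers = {"Authorization": "Bearer <your-access-token>"}

# List the workflows (hooks) you are allowed to trigger.
response = requests.get(f"{ODP_BASE_URL}/workflows", headers=headers)
response.raise_for_status()
print(response.json())
```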

Prerequisites

No. 1 Access to My DataLake Services in general

You need access to My DataLake Services in general (a profile, the ability to create a project, invite users, and so on).

List of articles about My DataLake Services

No. 2 Access to Hook Services in particular

You must have access to Hook Service features in your My DataLake Services project.

How to request roles for hook service on My DataLake Services

No. 3 Access to the OnDemand Data Processing API

Interaction with the Hook Service is based on the OnDemand Data Processing API. Because this API is a DestinE Edge Service, you must have the necessary permissions to use it.

For details about requesting the required access, see Access DestinE Edge Services.
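
For orientation, obtaining an access token usually follows the standard OAuth2/OIDC pattern sketched below. The token URL and client ID are placeholders, not the real values; take those from the Access DestinE Edge Services documentation.

```python
import requests

# Placeholder values: the real token endpoint and client ID are given in
# the "Access DestinE Edge Services" documentation.
TOKEN_URL = "https://auth.example.destination-earth.eu/token"
CLIENT_ID = "<client-id>"

payload = {
    "grant_type": "password",  # standard OAuth2 resource-owner flow
    "client_id": CLIENT_ID,
    "username": "<your-DESP-username>",
    "password": "<your-DESP-password>",
}

response = requests.post(TOKEN_URL, data=payload)
response.raise_for_status()
access_token = response.json()["access_token"]
print("Access token acquired; expires in",
      response.json().get("expires_in"), "seconds")
```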

No. 4 Use JupyterHub to run the examples

If you want to go through the examples and learn how to create your own custom hook, JupyterHub will be the main tool:

Run a notebook on JupyterHub

How to request JupyterHub roles on My DataLake Services

Provided Hooks

Provided Hooks are workflows and functions that have been pre-developed and pre-deployed by Destination Earth Data Lake: serverless processing services ready to be used on demand.

Currently available provided hooks are:

Workflow Name   Display Name
-------------   ------------------------------------------
data-harvest    Data Harvest
card_bs         Sentinel-1: Terrain-corrected backscatter
card_cohinf     Sentinel-1: Coherence/Interferometry
c2rcc           Sentinel-2: C2RCC
lai             Sentinel-2: SNAP-Biophysical
sen2cor         Sentinel-2: Sen2Cor

The Data Harvest processor lets you download commonly used datasets to your S3 object storage bucket.
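
As a sketch, submitting a Data Harvest run could look like the following. The endpoint path and the payload keys are illustrative assumptions, not the real request schema; the dedicated page linked below documents the actual parameters.

```python
import requests

ODP_BASE_URL = "https://odp.example.destination-earth.eu"  # placeholder
headers = {"Authorization": "Bearer <your-access-token>"}

# Hypothetical payload: the real parameter names for data-harvest are
# documented on the Provided Hooks page.
job = {
    "workflow": "data-harvest",
    "inputs": {
        "source": "<dataset-or-product-identifier>",
        "output_bucket": "s3://<your-bucket>/harvested/",
    },
}

response = requests.post(f"{ODP_BASE_URL}/workflows",
                         headers=headers, json=job)
response.raise_for_status()
print("Submitted run:", response.json())
```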

For more details, see the dedicated Provided Hooks page.

Custom Hooks

You can create your own custom hooks to match your project requirements. The Custom Hooks page includes a complete starting example called hello-world-processor.
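
To give a feel for what a custom hook contains, here is a hypothetical entrypoint in the spirit of hello-world-processor. The parameter-passing convention (environment variables) is an assumption for illustration; the actual contract is described on the Custom Hooks page.

```python
import os


def main() -> None:
    # Hooks run as containers (see "How it works"); a common convention,
    # assumed here for illustration, is that the workflow engine passes
    # run parameters to the container through environment variables.
    name = os.environ.get("NAME", "world")
    print(f"Hello, {name}! This message was produced inside a hook container.")


if __name__ == "__main__":
    main()
```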

For more details, see the dedicated Custom Hooks page.

Jupyter Notebook examples

A collection of Jupyter Notebook examples showing how to use the DestinE Data Lake services is available from Destination Earth on GitHub.

In the HOOK folder you can find two complete examples:

Tutorial

Demonstrates how to discover available workflows and run a workflow example (Data Harvest).

DEDL-Hook_access.ipynb

Shows how to retrieve an access token and list available workflows.

Next Steps

Start with Provided Hooks to run an existing workflow, then follow Custom Hooks to build your own.