Harvester Processor
The "Harvester Processor" module simplifies the often complex process of searching for Sentinel satellite metadata. Recognizing the varied area coverage, download times, archives, and other differences across data hubs, we have chosen to integrate with the Copernicus dataspace for data retrieval. This decision ensures users have access to a comprehensive and reliable source for Sentinel data. For more sophisticated searching, check the Umbrella Sentinel Access Point application
- Comprehensive Search: Leverages the extensive satellite data archives available through the Copernicus Data Space Ecosystem, offering users a wide range of imagery options.
- Customized Queries: Users can specify their search criteria based on bounding box (bbox), sensing start date, and sensing end date, among other parameters, allowing for highly targeted data retrieval.
- Ease of Use: Through integration with Docker, users can run the Harvester Processor with minimal setup, directly passing their search parameters as arguments to the Docker container.
- Copernicus Registration: Users must be registered with the Copernicus Data Space Ecosystem to access and download the data. Registration is free and can be completed on the Copernicus Data Space Ecosystem website (see the authentication sketch after this list).
- Docker Installation: Docker must be installed on your machine.
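For reference, the Copernicus credentials created at registration are exchanged for an access token when data is downloaded. A manual token request against the Copernicus Data Space Ecosystem identity service looks roughly like the sketch below; the exact flow this module uses may differ, so treat it as illustrative only.
# Exchange CDSE credentials for an access token (password grant with the
# public client id documented by the Copernicus Data Space Ecosystem).
curl -s -X POST "https://identity.dataspace.copernicus.eu/auth/realms/CDSE/protocol/openid-connect/token" \
  -d "grant_type=password" \
  -d "client_id=cdse-public" \
  -d "username=YOUR_EMAIL" \
  -d "password=YOUR_PASSWORD"
# The JSON response contains an "access_token" field used to authorize downloads.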
Download the git repository and use the existing Docker image:
- Open your terminal.
- Navigate to the directory where you want to clone the repository.
- Run the following command:
git clone https://github.com/Agri-Hub/eoProcessors.git
This will create a directory named eoProcessors in your current directory, containing the repository's contents.
To build the Docker image from source, clone this repository and navigate to the directory containing the Dockerfile:
git clone https://github.com/Agri-Hub/eoProcessors.git
cd eoProcessors/harvester_processor
docker build -t harvester_processor .
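If the build completes successfully, the new image should appear in your local image list:
# List locally available images named harvester_processor
docker image ls harvester_processor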
To use the Harvester Processor, users should first ensure they have Docker installed and configured on their machine. They can then run the Docker image, specifying their search criteria as environment variables according to the Sentinel mission (a worked example follows the parameter list below):
docker run --rm -v /path/to/local/data:/app/data \
-e username='YOUR_EMAIL' \
-e password='YOUR_PASSWORD' \
-e startDate='YYYY-MM-DD' \
-e endDate='YYYY-MM-DD' \
-e bbox='MIN_LON,MIN_LAT,MAX_LON,MAX_LAT' \
-e cloudCover=MAX_CLOUD_COVER \
-e level=PROCESSING_LEVEL \
harvester_processor
Parameters
- /path/to/local/data: Path on your local machine where downloaded data will be stored.
- username: Your username for accessing the Copernicus Data Space Ecosystem.
- password: Your password for accessing the Copernicus Data Space Ecosystem.
- startDate & endDate: Date range for the imagery search (format: YYYY-MM-DD).
- bbox or tile: Either bbox, the geographical bounding box of the search area (format: MIN_LON,MIN_LAT,MAX_LON,MAX_LAT), or tile, the tile name of the product (e.g. 34UFG).
- cloudCover: Maximum acceptable cloud cover percentage for the images (0-100).
- level: Processing level of the Sentinel-2 data (1 for Level1C, 2 for Level2A, or 12 for both).
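For example, a search for Sentinel-2 Level-2A imagery over a small area of interest during May 2024, with at most 30% cloud cover, might look like the following. The local path, dates, bounding box, and credentials are placeholders; substitute your own values.
# Example run: Sentinel-2 Level-2A products, May 2024, up to 30% cloud cover.
docker run --rm -v /home/user/sentinel_data:/app/data \
  -e username='you@example.com' \
  -e password='YOUR_PASSWORD' \
  -e startDate='2024-05-01' \
  -e endDate='2024-05-31' \
  -e bbox='23.5,37.8,24.1,38.2' \
  -e cloudCover=30 \
  -e level=2 \
  harvester_processor
Downloaded products will appear under /home/user/sentinel_data on the host machine.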
Limitations
- APIs can change over time, with endpoints being deprecated or modified. This requires regular maintenance of the codebase and may involve rewriting parts of your application to match new API specifications.
- Products can be deleted or taken offline due to deletion policies and archiving restrictions, which may remove access to historical data critical for longitudinal analysis or compliance purposes.