{"_id":"5824b9d561ad6b2d0030f9ba","parentDoc":null,"user":"55fae9d4825d5f19001fa379","__v":0,"category":{"_id":"573b4f62ef164e2900a2b881","__v":0,"project":"55faeacad0e22017005b8265","version":"55faeacad0e22017005b8268","sync":{"url":"","isSync":false},"reference":false,"createdAt":"2016-05-17T17:05:38.443Z","from_sync":false,"order":7,"slug":"algorithm-guide","title":"Certified Algorithms"},"project":"55faeacad0e22017005b8265","version":{"_id":"55faeacad0e22017005b8268","project":"55faeacad0e22017005b8265","__v":34,"createdAt":"2015-09-17T16:31:06.800Z","releaseDate":"2015-09-17T16:31:06.800Z","categories":["55faeacbd0e22017005b8269","55faf550764f50210095078e","55faf5b5626c341700fd9e96","55faf8a7825d5f19001fa386","560052f91503430d007cc88f","560054f73aa0520d00da0b1a","56005aaf6932a00d00ba7c62","56005c273aa0520d00da0b3f","5601ae7681a9670d006d164d","5601ae926811d00d00ceb487","5601aeb064866b1900f4768d","5601aee850ee460d0002224c","5601afa02499c119000faf19","5601afd381a9670d006d1652","561d4c78281aec0d00eb27b6","561d588d8ca8b90d00210219","563a5f934cc3621900ac278c","5665c5763889610d0008a29e","566710a36819320d000c2e93","56ddf6df8a5ae10e008e3926","56e1c96b2506700e00de6e83","56e1ccc4e416450e00b9e48c","56e1ccdfe63f910e00e59870","56e1cd10bc46be0e002af26a","56e1cd21e416450e00b9e48e","56e3139a51857d0e008e77be","573b4f62ef164e2900a2b881","57c9d1335fd8ca0e006308ed","57e2bd9d1e7b7220000d7fa5","57f2b992ac30911900c7c2b6","58adb5c275df0f1b001ed59b","58c81b5c6dc7140f003c3c46","595412446ed4d9001b3e7b37","59e76ce41938310028037295"],"is_deprecated":false,"is_hidden":false,"is_beta":false,"is_stable":true,"codename":"v1","version_clean":"1.0.0","version":"1"},"updates":[],"next":{"pages":[],"description":""},"createdAt":"2016-11-10T18:17:57.973Z","link_external":false,"link_url":"","githubsync":"","sync_unique":"","hidden":false,"api":{"results":{"codes":[]},"settings":"","auth":"required","params":[],"url":""},"isReference":false,"order":8,"body":"## Table of Contents\n\nSection | Description\n--- | ---\n[Imagery Examples](#Imagery Examples) | Before and after examples\n[Quickstart](#Quickstart) | Get started with a Python-based quickstart tutorial\n[Task Runtime](#Task Runtime) | Benchmark runtimes for the algorithm\n[Input Options](#Input Options) | Required and optional task inputs\n[Outputs](#Outputs) | Task outputs and example contents\n[Advanced Options](#Advanced Options) | Additional information for advanced users\n[Known Issues](#Known Issues) | Issues users should be aware of\n\n\n## <a name=\"Imagery Examples\"></a>Imagery Examples\nThis task automatically classifies change detection between two images, so two input rasters are required. 
![Before: first input raster](https://files.readme.io/261ce8d-ENVI_AutoChangeThresholdClassification_before.jpg)
*Before: First of two input images that will be compared when Auto Change Threshold Classification is run*

![Before: second input raster](https://files.readme.io/85fd970-ENVI_AutoChangeThresholdClassification_before_raster2.jpg)
*Before: Second of two input images that will be compared when Auto Change Threshold Classification is run*

![After: classification output](https://files.readme.io/87d85d2-ENVI_AutoChangeThresholdClassification_after.jpg)
*After: Output of ENVI Auto Change Threshold Classification*

## <a name="quickstart"></a>Quickstart Tutorial

Example script: run in a Python environment (e.g., IPython) using the gbdxtools interface.

```python
from gbdxtools import Interface
gbdx = Interface()

# Edit the following paths to reflect specific paths to the input images
NDVI1 = 's3://gbd-customer-data/CustomerAccount#/PathToImage1/'
NDVI2 = 's3://gbd-customer-data/CustomerAccount#/PathToImage2/'

envi_IBD = gbdx.Task("ENVI_ImageBandDifference")
envi_IBD.inputs.input_raster1 = NDVI1
envi_IBD.inputs.input_raster2 = NDVI2

envi_ACTC = gbdx.Task("ENVI_AutoChangeThresholdClassification")
envi_ACTC.inputs.input_raster = envi_IBD.outputs.output_raster_uri.value

workflow = gbdx.Workflow([envi_IBD, envi_ACTC])

workflow.savedata(
    envi_ACTC.outputs.output_raster_uri,
    location='AutoChangeThreshold/output_raster_uri'
)

print(workflow.execute())
print(workflow.status)
# Repeat workflow.status as needed to monitor progress.
```
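Rather than re-running `workflow.status` by hand, the check can be repeated in a simple polling loop. The sketch below is illustrative only and assumes the status returned by gbdxtools is a dictionary with a `state` entry; adjust the terminal states to match what your gbdxtools version actually reports.

```python
import time

# Rough polling sketch (assumption: workflow.status returns a dictionary with
# a 'state' entry, e.g. 'pending', 'running', 'complete', 'failed').
while True:
    status = workflow.status
    print(status)
    if status.get('state') not in ('pending', 'running'):
        break
    time.sleep(60)  # wait a minute between checks
```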
## <a name="task-runtime"></a>Task Runtime

These are the average runtimes for this algorithm. All benchmark tests were run using a standard set of images, based on our most common customer scenarios. Runtime benchmarks apply to the specific algorithm and do not represent the runtime of a complete workflow.

Sensor Name | Total Pixels | Total Area (km²) | Time (secs) | Time/Area (secs/km²)
--- | ---: | ---: | ---: | ---:
WV02 | 73,005,420 | 292.02 | 169.60 | 0.58

## <a name="input-options"></a>Input Options

The following table lists all inputs for this task. For details regarding the use of all ENVI input types, refer to the [ENVI Task Runner Inputs](https://github.com/TDG-Platform/docs/blob/master/ENVI_Task_Runner_Inputs.md) documentation.

| Name                       | Required | Default | Valid Values | Description |
| -------------------------- | :------: | :-----: | :----------: | ----------- |
| input_raster               |   True   |  None   | A valid S3 URL containing image files. | Specify a raster from which to run the task. -- Value Type: ENVIRASTER |
| input_raster_format        |  False   |  None   | [See ENVIRASTER input type](https://github.com/TDG-Platform/docs/blob/master/ENVI_Task_Runner_Inputs.md) | Provide the format of the image, for example: landsat-8. -- Value Type: STRING |
| input_raster_band_grouping |  False   |  None   | [See ENVIRASTER input type](https://github.com/TDG-Platform/docs/blob/master/ENVI_Task_Runner_Inputs.md) | Provide the name of the band grouping to be used in the task, e.g., panchromatic. -- Value Type: STRING |
| input_raster_filename      |  False   |  None   | [See ENVIRASTER input type](https://github.com/TDG-Platform/docs/blob/master/ENVI_Task_Runner_Inputs.md) | Provide the explicit relative raster filename that ENVI will open. This overrides any file lookup in the task runner. -- Value Type: STRING |
| change_type                |  False   | 'Both'  | 'Increase', 'Decrease', 'Both' | Specify the type of change of interest to consider. -- Value Type: STRING |
| threshold_method           |  False   | 'Otsu'  | 'Otsu', 'Tsai', 'Kapur', 'Kittler' | Specify the thresholding method. -- Value Type: STRING |
| output_raster_uri_filename |  False   |  None   | string | Specify a string with the fully-qualified path and filename for OUTPUT_RASTER. -- Value Type: STRING |

**Description of Auto Threshold Methods**

| Name    | Description |
| ------- | ----------- |
| Otsu    | A histogram shape-based method based on discriminant analysis. It uses the zero- and first-order cumulative moments of the histogram to calculate the value of the thresholding level. |
| Tsai    | A moment-based method. It determines the threshold so that the first three moments of the input image are preserved in the output image. |
| Kapur   | An entropy-based method. It considers the thresholding image as two classes of events, with each class characterized by a probability density function (PDF). The method then maximizes the sum of the entropy of the two PDFs to converge on a single threshold value. |
| Kittler | A histogram shape-based method. It approximates the histogram as a bimodal Gaussian distribution and finds a cutoff point. The cost function is based on the Bayes classification rule. |
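As an illustration of the optional inputs listed above, the quickstart task could be configured with an explicit change type, threshold method, and output filename before the workflow is built. The values shown are examples only, not recommendations; `envi_ACTC` is the task created in the quickstart script.

```python
# Example settings for the optional inputs described in the table above
# (illustrative values only).
envi_ACTC.inputs.change_type = 'Increase'        # 'Increase', 'Decrease', or 'Both' (default)
envi_ACTC.inputs.threshold_method = 'Kittler'    # 'Otsu' (default), 'Tsai', 'Kapur', or 'Kittler'
envi_ACTC.inputs.output_raster_uri_filename = 'change_classification'
```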
## <a name="outputs"></a>Outputs

The following table lists all outputs from this task.

| Name              | Required | Description |
| ----------------- | :------: | ----------- |
| output_raster_uri |   True   | Output for OUTPUT_RASTER. |
| task_meta_data    |  False   | GBDX option. Output location for task metadata such as the execution log and output JSON. |

##### Output Structure

The output_raster image file will be written to the specified S3 customer account location in GeoTIFF (\*.tif) format, with an ENVI header file (\*.hdr).

## <a name="advanced-options"></a>Advanced Options

This task takes two multispectral images, which share a geospatial extent, as input. This example workflow includes the following ENVI tasks to prepare the images for the Change Threshold Classification task: Spectral Index, Image Intersection, and Image Band Difference. Input rasters for the Change Threshold Classification task may be any set of single-band rasters sharing the same extent, spatial reference, and pixel value format (e.g., Normalized Difference Vegetation Index).

```python
from gbdxtools import Interface
gbdx = Interface()

# Edit the following paths to reflect specific paths to the input images
data1 = 's3://gbd-customer-data/CustomerAccount#/PathToImage1/'
data2 = 's3://gbd-customer-data/CustomerAccount#/PathToImage2/'

aoptask1 = gbdx.Task("AOP_Strip_Processor")
aoptask1.inputs.data = data1
aoptask1.inputs.enable_dra = False
aoptask1.inputs.bands = 'MS'

aoptask2 = gbdx.Task("AOP_Strip_Processor")
aoptask2.inputs.data = data2
aoptask2.inputs.enable_dra = False
aoptask2.inputs.bands = 'MS'

envi_ndvi1 = gbdx.Task("ENVI_SpectralIndex")
envi_ndvi1.inputs.input_raster = aoptask1.outputs.data.value
envi_ndvi1.inputs.index = "Normalized Difference Vegetation Index"

envi_ndvi2 = gbdx.Task("ENVI_SpectralIndex")
envi_ndvi2.inputs.input_raster = aoptask2.outputs.data.value
envi_ndvi2.inputs.index = "Normalized Difference Vegetation Index"

envi_II = gbdx.Task("ENVI_ImageIntersection")
envi_II.inputs.input_raster1 = envi_ndvi1.outputs.output_raster_uri.value
envi_II.inputs.input_raster2 = envi_ndvi2.outputs.output_raster_uri.value
envi_II.inputs.output_raster1_uri_filename = "NDVI1"
envi_II.inputs.output_raster2_uri_filename = "NDVI2"

envi_IBD = gbdx.Task("ENVI_ImageBandDifference")
envi_IBD.inputs.input_raster1 = envi_II.outputs.output_raster1_uri.value
envi_IBD.inputs.input_raster2 = envi_II.outputs.output_raster2_uri.value

envi_ACTC = gbdx.Task("ENVI_AutoChangeThresholdClassification")
envi_ACTC.inputs.threshold_method = "Kapur"
envi_ACTC.inputs.input_raster = envi_IBD.outputs.output_raster_uri.value

workflow = gbdx.Workflow([
    aoptask1, aoptask2, envi_ndvi1, envi_ndvi2, envi_II, envi_IBD, envi_ACTC
])

workflow.savedata(
    envi_ACTC.outputs.output_raster_uri,
    location='AutoChangeThreshold/output_raster_uri'  # edit location to suit account
)

print(workflow.execute())
print(workflow.status)
# Repeat workflow.status as needed to monitor progress.
```

## <a name="known-issues"></a>Known Issues

Input rasters for the ENVI_AutoChangeThresholdClassification task will require pre-processing to fit the specific input requirements. Finding an acceptable threshold method may also require experimentation and iteration (e.g., Kapur may not be the ideal method for a given raster dataset).
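One way to carry out that experimentation is to run the same band-difference raster through several ENVI_AutoChangeThresholdClassification tasks, one per threshold method, and compare the saved results. The sketch below is illustrative only; it reuses `envi_IBD` from the quickstart script, and the S3 output locations are hypothetical.

```python
# Hedged sketch: compare threshold methods by running one classification task
# per method against the same band-difference output (envi_IBD from the
# quickstart script). Output locations are illustrative.
methods = ['Otsu', 'Tsai', 'Kapur', 'Kittler']
actc_tasks = []

for method in methods:
    task = gbdx.Task("ENVI_AutoChangeThresholdClassification")
    task.inputs.input_raster = envi_IBD.outputs.output_raster_uri.value
    task.inputs.threshold_method = method
    actc_tasks.append(task)

workflow = gbdx.Workflow([envi_IBD] + actc_tasks)

# Save each method's classification to its own S3 location for comparison
for method, task in zip(methods, actc_tasks):
    workflow.savedata(
        task.outputs.output_raster_uri,
        location='AutoChangeThreshold/compare/{}'.format(method)
    )

print(workflow.execute())
```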
## Background

For additional background information on this task, please refer to the [Harris Geospatial ENVI documentation](http://www.harrisgeospatial.com/docs/home.html).

#### Contact Us

If you have any questions or issues with this task, please contact [gbdx-support@digitalglobe.com](mailto:gbdx-support@digitalglobe.com).
