{"_id":"5818fc6fbeb0c20f000d4471","parentDoc":null,"category":{"_id":"573b4f62ef164e2900a2b881","__v":0,"project":"55faeacad0e22017005b8265","version":"55faeacad0e22017005b8268","sync":{"url":"","isSync":false},"reference":false,"createdAt":"2016-05-17T17:05:38.443Z","from_sync":false,"order":8,"slug":"algorithm-guide","title":"Algorithms"},"user":"55fae9d4825d5f19001fa379","__v":0,"version":{"_id":"55faeacad0e22017005b8268","project":"55faeacad0e22017005b8265","__v":35,"createdAt":"2015-09-17T16:31:06.800Z","releaseDate":"2015-09-17T16:31:06.800Z","categories":["55faeacbd0e22017005b8269","55faf550764f50210095078e","55faf5b5626c341700fd9e96","55faf8a7825d5f19001fa386","560052f91503430d007cc88f","560054f73aa0520d00da0b1a","56005aaf6932a00d00ba7c62","56005c273aa0520d00da0b3f","5601ae7681a9670d006d164d","5601ae926811d00d00ceb487","5601aeb064866b1900f4768d","5601aee850ee460d0002224c","5601afa02499c119000faf19","5601afd381a9670d006d1652","561d4c78281aec0d00eb27b6","561d588d8ca8b90d00210219","563a5f934cc3621900ac278c","5665c5763889610d0008a29e","566710a36819320d000c2e93","56ddf6df8a5ae10e008e3926","56e1c96b2506700e00de6e83","56e1ccc4e416450e00b9e48c","56e1ccdfe63f910e00e59870","56e1cd10bc46be0e002af26a","56e1cd21e416450e00b9e48e","56e3139a51857d0e008e77be","573b4f62ef164e2900a2b881","57c9d1335fd8ca0e006308ed","57e2bd9d1e7b7220000d7fa5","57f2b992ac30911900c7c2b6","58adb5c275df0f1b001ed59b","58c81b5c6dc7140f003c3c46","595412446ed4d9001b3e7b37","59e76ce41938310028037295","5a009de510890d001c2aabfe"],"is_deprecated":false,"is_hidden":false,"is_beta":false,"is_stable":true,"codename":"v1","version_clean":"1.0.0","version":"1"},"project":"55faeacad0e22017005b8265","updates":[],"next":{"pages":[],"description":""},"createdAt":"2016-11-01T20:34:55.874Z","link_external":false,"link_url":"","githubsync":"","sync_unique":"","hidden":false,"api":{"results":{"codes":[]},"settings":"","auth":"required","params":[],"url":""},"isReference":false,"order":23,"body":"## Table of Contents\n\nSection | Description\n--- | ---\n[Imagery Examples](#Imagery Examples) | Before and after examples\n[Quickstart](#Quickstart) | Get started with a Python-based quickstart tutorial\n[Task Runtime](#Task Runtime) | Benchmark runtimes for the algorithm\n[Input Options](#Input Options) | Required and optional task inputs\n[Outputs](#Outputs) | Task outputs and example contents\n[Advanced Options](#Advanced Options) | Additional information for advanced users\n[Known Issues](#Known Issues) | Issues users should be aware of\n\n\n## <a name=\"Imagery Examples\"></a>Imagery Examples\n\n[block:image]\n{\n  \"images\": [\n    {\n      \"image\": [\n        \"https://files.readme.io/4ee937c-AOP_ImageR.jpg\",\n        \"AOP_ImageR.jpg\",\n        800,\n        320,\n        \"#453d30\"\n      ],\n      \"caption\": \"Before: Input image before ENVI ROI to Classification has been run\"\n    }\n  ]\n}\n[/block]\n\n[block:image]\n{\n  \"images\": [\n    {\n      \"image\": [\n        \"https://files.readme.io/602caad-ROIToClassLegendR.jpg\",\n        \"ROIToClassLegendR.jpg\",\n        800,\n        320,\n        \"#040409\"\n      ],\n      \"caption\": \"After: Same image with legend after ENVI ROI to Classification has been run\"\n    }\n  ]\n}\n[/block]\n\n## <a name=\"Quickstart\"></a>Quickstart Tutorial\nExample Script: Run in a python environment (i.e. 
- IPython) using the gbdxtools interface.\n\n[block:code]\n{\n  \"codes\": [\n    {\n      \"code\": \"from gbdxtools import Interface\\ngbdx = Interface()\\n\\n# Edit the following path to reflect a specific path to an image\\ninput_raster_data = 's3://gbd-customer-data/CustomerAccount#/PathToImage/'\\ninput_roi_data = 's3://gbd-customer-data/CustomerAccount#/PathToROIFile/'\\n\\nclasstask = gbdx.Task(\\\"ENVI_ROIToClassification\\\")\\nclasstask.inputs.input_raster = input_raster_data\\nclasstask.inputs.input_roi = input_roi_data\\n\\nworkflow = gbdx.Workflow([ classtask ])\\n\\nworkflow.savedata(\\n    classtask.outputs.output_raster_uri, \\n    location='ROIToClassification/output_raster_uri' # edit location to suit account\\n)\\n\\nprint workflow.execute()\\nprint workflow.status\\n# Repeat workflow.status as needed to monitor progress.\",\n      \"language\": \"python\"\n    }\n  ]\n}\n[/block]\n## <a name=\"Task Runtime\"></a>Task Runtime\nThese are the average runtimes for this algorithm. All benchmark tests were run using a standard set of images, based on our most common customer scenarios. Runtime benchmarks apply to the specific algorithm, and don’t represent the runtime of a complete workflow.\n\n  Sensor Name  |  Total Pixels  |  Total Area (k2)  |  Time(secs)  |  Time/Area k2\n--------|:----------:|-----------|----------------|---------------\nQB02 | 41,551,668 | 312.07 | 172.56 | 0.55 \nWV02|35,872,942 | 329.87 | 173.40 | 0.53 \nWV03|35,371,971 | 196.27 | 197.48 | 0.88 \nGE01| 57,498,000 | 332.97 | 184.30 | 0.55 \n\n\n\n## <a name=\"Input Options\"></a>Input Options\nThe following table lists all inputs for this task. For details regarding the use of all ENVI input types refer to the [ENVI Task Runner Inputs]([See ENVIRASTER input type](https://github.com/TDG-Platform/docs/blob/master/ENVI_Task_Runner_Inputs.md)) documentation.\n\n| Name                       | Required | Default |               Valid Values               | Description                              |\n| -------------------------- | :------: | :-----: | :--------------------------------------: | ---------------------------------------- |\n| input_raster               |   True   |  None   |  A valid S3 URL containing image files.  | Specify a raster from which to run the task. -- Value Type: ENVIRASTER |\n| input_raster_format        |  False   |  None   | [See ENVIRASTER input type](https://github.com/TDG-Platform/docs/blob/master/ENVI_Task_Runner_Inputs.md) | Provide the format of the image, for example: landsat-8. -- Value Type: STRING |\n| input_raster_band_grouping |  False   |  None   | [See ENVIRASTER input type](https://github.com/TDG-Platform/docs/blob/master/ENVI_Task_Runner_Inputs.md) | Provide the name of the band grouping to be used in the task, ie - panchromatic. -- Value Type: STRING |\n| input_raster_filename      |  False   |  None   | [See ENVIRASTER input type](https://github.com/TDG-Platform/docs/blob/master/ENVI_Task_Runner_Inputs.md) | Provide the explicit relative raster filename that ENVI will open. This overrides any file lookup in the task runner. -- Value Type: STRING |\n| input_roi                  |   True   |   N/A   |              A valid S3 URL              | Specify an ROI or array of ROIs used to create the classification image. 
-- Value Type: [ENVIROI](https://github.com/TDG-Platform/docs/blob/master/ENVI_Task_Runner_Inputs.md#enviroi)[*] |\n| output_raster_uri_filename |  False   |  None   |                  string                  | Specify a string with the fully-qualified path and filename for OUTPUT_RASTER. -- Value Type: STRING |\n\n\n\n## <a name=\"Outputs\"></a>Outputs\nThe following table lists all the outputs from this task.\n\n| Name              | Required | Description                              |\n| ----------------- | :------: | ---------------------------------------- |\n| output_raster_uri |   True   | Output for OUTPUT_RASTER.                |\n| task_meta_data    |  False   | GBDX Option. Output location for task meta data such as execution log and output JSON. |\n\n##### Output Structure\n\nThe output_raster image file will be written to the specified S3 Customer Account Location in GeoTiff (\\*.tif) format, with an ENVI header file (\\*.hdr).\n\n## <a name=\"Advanced Options\"></a>Advanced Options\n\nTo link the workflow of 1 input image into the Advanced Image Preprocessor and into the ENVI ROI To Classification task you must use the following gbdxtools script python example:\n[block:code]\n{\n  \"codes\": [\n    {\n      \"code\": \"from gbdxtools import Interface\\ngbdx = Interface()\\n\\n# Edit the following path to reflect a specific path to an image\\ndata = 's3://gbd-customer-data/CustomerAccount#/PathToImage/'\\n\\naoptask = gbdx.Task(\\\"AOP_Strip_Processor\\\") \\naoptask.inputs.data = data\\naoptask.inputs.enable_dra = False\\naoptask.inputs.bands = 'MS'\\n\\nthreshold = gbdx.Task(\\\"ENVI_ImageThresholdToROI\\\")\\nthreshold.inputs.input_raster = aoptask.outputs.data.value\\nthreshold.inputs.roi_name = \\\"[\\\\\\\"Water\\\\\\\", \\\\\\\"Land\\\\\\\"]\\\"\\nthreshold.inputs.roi_color = \\\"[[0,255,0],[0,0,255]]\\\"\\nthreshold.inputs.threshold = \\\"[[138,221,0],[222,306,0]]\\\"\\nthreshold.inputs.output_roi_uri_filename = \\\"roi\\\"\\n\\nroitoclass = gbdx.Task(\\\"ENVI_ROIToClassification\\\")\\nroitoclass.inputs.input_raster = aoptask.outputs.data.value\\nroitoclass.inputs.input_roi = threshold.outputs.output_roi_uri.value\\n\\nworkflow = gbdx.Workflow([ aoptask, threshold, roitoclass ])\\n\\nworkflow.savedata(\\n    roitoclass.outputs.output_raster_uri,\\n    location='ROIToClassification/output_raster_uri'\\n)\\n\\nprint workflow.execute()\\nprint workflow.status\\n# Repeat workflow.status as needed to monitor progress.\",\n      \"language\": \"python\"\n    }\n  ]\n}\n[/block]\n##<a name=\"Known Issues\"></a>Known Issues\nNone.\n\n## Background\nFor additional background information on this task please refer to the <a href=\"http://www.harrisgeospatial.com/docs/home.html\">Harris Geospatial ENVI documentation.</a>​\n\n#### Contact Us   \nIf you have any questions or issues with this task, please contact [**gbdx-support:::at:::digitalglobe.com** ](mailto:gbdx-support@digitalglobe.com).","excerpt":"This task creates a classification image from regions of interest (ROIs).  The input ROI file must be created using the [ENVI Image Threshold To ROI Task](http://gbdxdocs.digitalglobe.com/docs/envi-image-threshold-to-roi).  You may use a pre-existing ROI dataset or  produce the final classification as part of a larger workflow. 
\n\n**GBDX Registered Name:** ENVI_ROIToClassification\n**Provider:** Harris\tGeospatial Solutions\nFor more information on how to execute this task please refer to the [ENVI® Task Runner Inputs](doc:envi-task-runner-inputs) .","slug":"envi-roi-to-classification","type":"basic","title":"ENVI® ROI to Classification"}

# ENVI® ROI to Classification

This task creates a classification image from regions of interest (ROIs). The input ROI file must be created using the [ENVI Image Threshold To ROI Task](http://gbdxdocs.digitalglobe.com/docs/envi-image-threshold-to-roi). You may use a pre-existing ROI dataset or produce the final classification as part of a larger workflow.

**GBDX Registered Name:** ENVI_ROIToClassification
**Provider:** Harris Geospatial Solutions

For more information on how to execute this task, please refer to the [ENVI® Task Runner Inputs](doc:envi-task-runner-inputs).

## Table of Contents

Section | Description
--- | ---
[Imagery Examples](#Imagery Examples) | Before and after examples
[Quickstart](#Quickstart) | Get started with a Python-based quickstart tutorial
[Task Runtime](#Task Runtime) | Benchmark runtimes for the algorithm
[Input Options](#Input Options) | Required and optional task inputs
[Outputs](#Outputs) | Task outputs and example contents
[Advanced Options](#Advanced Options) | Additional information for advanced users
[Known Issues](#Known Issues) | Issues users should be aware of

## <a name="Imagery Examples"></a>Imagery Examples

![AOP_ImageR.jpg](https://files.readme.io/4ee937c-AOP_ImageR.jpg)
*Before: Input image before ENVI ROI to Classification has been run*

![ROIToClassLegendR.jpg](https://files.readme.io/602caad-ROIToClassLegendR.jpg)
*After: Same image with legend after ENVI ROI to Classification has been run*

## <a name="Quickstart"></a>Quickstart Tutorial

Example script: run in a Python environment (e.g., IPython) using the gbdxtools interface.

```python
from gbdxtools import Interface
gbdx = Interface()

# Edit the following paths to reflect specific paths to the image and ROI file
input_raster_data = 's3://gbd-customer-data/CustomerAccount#/PathToImage/'
input_roi_data = 's3://gbd-customer-data/CustomerAccount#/PathToROIFile/'

classtask = gbdx.Task("ENVI_ROIToClassification")
classtask.inputs.input_raster = input_raster_data
classtask.inputs.input_roi = input_roi_data

workflow = gbdx.Workflow([classtask])

workflow.savedata(
    classtask.outputs.output_raster_uri,
    location='ROIToClassification/output_raster_uri'  # edit location to suit account
)

print(workflow.execute())
print(workflow.status)
# Repeat workflow.status as needed to monitor progress.
```

## <a name="Task Runtime"></a>Task Runtime

These are the average runtimes for this algorithm. All benchmark tests were run using a standard set of images, based on our most common customer scenarios. Runtime benchmarks apply to the specific algorithm and do not represent the runtime of a complete workflow.

Sensor Name | Total Pixels | Total Area (km²) | Time (secs) | Time/Area (secs/km²)
--------|:----------:|-----------|----------------|---------------
QB02 | 41,551,668 | 312.07 | 172.56 | 0.55
WV02 | 35,872,942 | 329.87 | 173.40 | 0.53
WV03 | 35,371,971 | 196.27 | 197.48 | 0.88
GE01 | 57,498,000 | 332.97 | 184.30 | 0.55

## <a name="Input Options"></a>Input Options

The following table lists all inputs for this task. For details regarding the use of all ENVI input types, refer to the [ENVI Task Runner Inputs](https://github.com/TDG-Platform/docs/blob/master/ENVI_Task_Runner_Inputs.md) documentation. A short gbdxtools sketch showing how the optional inputs are set follows the table.

| Name                       | Required | Default |               Valid Values               | Description                              |
| -------------------------- | :------: | :-----: | :--------------------------------------: | ---------------------------------------- |
| input_raster               |   True   |  None   |  A valid S3 URL containing image files.  | Specify a raster from which to run the task. -- Value Type: ENVIRASTER |
| input_raster_format        |  False   |  None   | [See ENVIRASTER input type](https://github.com/TDG-Platform/docs/blob/master/ENVI_Task_Runner_Inputs.md) | Provide the format of the image, for example: landsat-8. -- Value Type: STRING |
| input_raster_band_grouping |  False   |  None   | [See ENVIRASTER input type](https://github.com/TDG-Platform/docs/blob/master/ENVI_Task_Runner_Inputs.md) | Provide the name of the band grouping to be used in the task, e.g. panchromatic. -- Value Type: STRING |
| input_raster_filename      |  False   |  None   | [See ENVIRASTER input type](https://github.com/TDG-Platform/docs/blob/master/ENVI_Task_Runner_Inputs.md) | Provide the explicit relative raster filename that ENVI will open. This overrides any file lookup in the task runner. -- Value Type: STRING |
| input_roi                  |   True   |   N/A   |              A valid S3 URL              | Specify an ROI or array of ROIs used to create the classification image. -- Value Type: [ENVIROI](https://github.com/TDG-Platform/docs/blob/master/ENVI_Task_Runner_Inputs.md#enviroi)[*] |
| output_raster_uri_filename |  False   |  None   |                  string                  | Specify a string with the fully-qualified path and filename for OUTPUT_RASTER. -- Value Type: STRING |
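The optional inputs are assigned in gbdxtools exactly like the required ones. Below is a minimal sketch; the format and filename values are illustrative assumptions, not defaults, and should be adjusted to the imagery actually being processed.

```python
from gbdxtools import Interface
gbdx = Interface()

# Required inputs (edit the paths to suit your account)
classtask = gbdx.Task("ENVI_ROIToClassification")
classtask.inputs.input_raster = 's3://gbd-customer-data/CustomerAccount#/PathToImage/'
classtask.inputs.input_roi = 's3://gbd-customer-data/CustomerAccount#/PathToROIFile/'

# Optional inputs -- example values only, not defaults
classtask.inputs.input_raster_format = 'landsat-8'               # explicit image format
classtask.inputs.output_raster_uri_filename = 'classification'   # filename for OUTPUT_RASTER

workflow = gbdx.Workflow([classtask])
workflow.savedata(
    classtask.outputs.output_raster_uri,
    location='ROIToClassification/output_raster_uri'  # edit location to suit account
)
print(workflow.execute())
```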
## <a name="Outputs"></a>Outputs

The following table lists all the outputs from this task.

| Name              | Required | Description                              |
| ----------------- | :------: | ---------------------------------------- |
| output_raster_uri |   True   | Output for OUTPUT_RASTER.                |
| task_meta_data    |  False   | GBDX option. Output location for task metadata such as the execution log and output JSON. |

##### Output Structure

The output_raster image file will be written to the specified S3 customer account location in GeoTIFF (\*.tif) format, with an ENVI header file (\*.hdr).

## <a name="Advanced Options"></a>Advanced Options

To chain a single input image through the Advanced Image Preprocessor (AOP_Strip_Processor), ENVI Image Threshold to ROI, and ENVI ROI to Classification in one workflow, use the following gbdxtools Python example:

```python
from gbdxtools import Interface
gbdx = Interface()

# Edit the following path to reflect a specific path to an image
data = 's3://gbd-customer-data/CustomerAccount#/PathToImage/'

aoptask = gbdx.Task("AOP_Strip_Processor")
aoptask.inputs.data = data
aoptask.inputs.enable_dra = False
aoptask.inputs.bands = 'MS'

threshold = gbdx.Task("ENVI_ImageThresholdToROI")
threshold.inputs.input_raster = aoptask.outputs.data.value
threshold.inputs.roi_name = "[\"Water\", \"Land\"]"
threshold.inputs.roi_color = "[[0,255,0],[0,0,255]]"
threshold.inputs.threshold = "[[138,221,0],[222,306,0]]"
threshold.inputs.output_roi_uri_filename = "roi"

roitoclass = gbdx.Task("ENVI_ROIToClassification")
roitoclass.inputs.input_raster = aoptask.outputs.data.value
roitoclass.inputs.input_roi = threshold.outputs.output_roi_uri.value

workflow = gbdx.Workflow([aoptask, threshold, roitoclass])

workflow.savedata(
    roitoclass.outputs.output_raster_uri,
    location='ROIToClassification/output_raster_uri'
)

print(workflow.execute())
print(workflow.status)
# Repeat workflow.status as needed to monitor progress.
```

## <a name="Known Issues"></a>Known Issues

None.

## Background

For additional background information on this task, please refer to the <a href="http://www.harrisgeospatial.com/docs/home.html">Harris Geospatial ENVI documentation</a>.

#### Contact Us

If you have any questions or issues with this task, please contact [**gbdx-support@digitalglobe.com**](mailto:gbdx-support@digitalglobe.com).