{"_id":"5601bf6d7916060d00760b00","category":{"_id":"5601afa02499c119000faf19","pages":["5601b42b6811d00d00ceb49b","5601b63c9137690d003357cb","5601b7b49137690d003357d0","5601b82a81a9670d006d166b","5601b84350ee460d00022261","5601b8c281a9670d006d166e","5601b9146811d00d00ceb4a3","5601ba086811d00d00ceb4a4","5601baf650ee460d00022264","5601bf6d7916060d00760b00","561d578e9242920d00df9f47","561d57ec2d6a450d00f0512d","561d58449242920d00df9f53","56438be2f49bfa0d002f561e","56438e101ecf381700343c48","5643902708894c0d00031f14","564391d1c92c470d002deb03","564394529eebf70d00490d35","56f194243eb62a34003e9fef"],"project":"55faeacad0e22017005b8265","__v":19,"version":"55faeacad0e22017005b8268","sync":{"url":"","isSync":false},"reference":false,"createdAt":"2015-09-22T19:44:32.525Z","from_sync":false,"order":10,"slug":"s3-storage-service-guide","title":"S3 Storage Service Guide"},"version":{"_id":"55faeacad0e22017005b8268","project":"55faeacad0e22017005b8265","__v":35,"createdAt":"2015-09-17T16:31:06.800Z","releaseDate":"2015-09-17T16:31:06.800Z","categories":["55faeacbd0e22017005b8269","55faf550764f50210095078e","55faf5b5626c341700fd9e96","55faf8a7825d5f19001fa386","560052f91503430d007cc88f","560054f73aa0520d00da0b1a","56005aaf6932a00d00ba7c62","56005c273aa0520d00da0b3f","5601ae7681a9670d006d164d","5601ae926811d00d00ceb487","5601aeb064866b1900f4768d","5601aee850ee460d0002224c","5601afa02499c119000faf19","5601afd381a9670d006d1652","561d4c78281aec0d00eb27b6","561d588d8ca8b90d00210219","563a5f934cc3621900ac278c","5665c5763889610d0008a29e","566710a36819320d000c2e93","56ddf6df8a5ae10e008e3926","56e1c96b2506700e00de6e83","56e1ccc4e416450e00b9e48c","56e1ccdfe63f910e00e59870","56e1cd10bc46be0e002af26a","56e1cd21e416450e00b9e48e","56e3139a51857d0e008e77be","573b4f62ef164e2900a2b881","57c9d1335fd8ca0e006308ed","57e2bd9d1e7b7220000d7fa5","57f2b992ac30911900c7c2b6","58adb5c275df0f1b001ed59b","58c81b5c6dc7140f003c3c46","595412446ed4d9001b3e7b37","59e76ce41938310028037295","5a009de510890d001c2aabfe"],"is_deprecated":false,"is_hidden":false,"is_beta":false,"is_stable":true,"codename":"v1","version_clean":"1.0.0","version":"1"},"user":"55fae9d4825d5f19001fa379","parentDoc":null,"project":"55faeacad0e22017005b8265","__v":1,"updates":[],"next":{"pages":[],"description":""},"createdAt":"2015-09-22T20:51:57.778Z","link_external":false,"link_url":"","githubsync":"","sync_unique":"","hidden":false,"api":{"results":{"codes":[]},"settings":"","auth":"required","params":[],"url":""},"isReference":false,"order":10,"body":"[block:api-header]\n{\n  \"type\": \"basic\",\n  \"title\": \"Python Example #1\"\n}\n[/block]\nThis Python script covers the following steps:\n\n1. Get your OAuth access token\n2. Request temporary credentials from the GBDX S3 Storage Service\n3. Send temporary credentials to AWS CLI\n4. 
Execute CLI command\n[block:code]\n{\n  \"codes\": [\n    {\n      \"code\": \" First install the aws cli:\\n        pip install aws-cli\\n        \\n        \\n        Copy this code into  create_s3tmp_creds.py :\\n        \\n        import requests\\n        \\n        # Get nricklin user token\\n        username = '<username>'\\n        password = '<password>'\\n        api_key = '<api-key>'\\n        \\n        token_url = 'https://geobigdata.io/auth/v1/oauth/token/'\\n        \\n        headers = {'Authorization': \\\"Basic %s\\\" % api_key}\\n        body = {\\\"grant_type\\\":'password', 'username': username, 'password': password}\\n        \\n        r = requests.post(token_url, headers=headers, data=body)\\n        \\n        access_token = r.json()['access_token']\\n        \\n        url          = 'https://geobigdata.io/s3creds/v1/prefix?duration=36000'\\n        headers      = {'Content-Type': 'application/json',\\\"Authorization\\\": \\\"Bearer \\\" + access_token}\\n        results      = requests.get(url,headers=headers)\\n        \\n        # print out the json results\\n        print \\\"[DEFAULT]\\\"\\n        print \\\"aws_secret_access_key=\\\" + results.json()['S3_secret_key']\\n        print \\\"aws_access_key_id=\\\" + results.json()['S3_access_key']\\n        print \\\"aws_session_token=\\\" + results.json()['S3_session_token']\\n        \\n        Dump credentials into your aws cli credentials file:\\n        python create_s3tmp_creds.py > ~/.aws/credentials\\n        \\n        \\n        Finally, use the AWS CLI to list items in your storage location:\\n        aws s3 ls s3://gbd-customer-data/<your_prefix>/\",\n      \"language\": \"python\"\n    }\n  ]\n}\n[/block]\n\n[block:api-header]\n{\n  \"type\": \"basic\",\n  \"title\": \"Python Example #2\"\n}\n[/block]\nThis Python example assumes you already have a valid OAuth token. It competes the following steps:\n\n1. Request temporary credentials from the GBDX S3 Storage Service\n2. Send temporary credentials to AWS CLI\n3. Execute CLI command\n[block:code]\n{\n  \"codes\": [\n    {\n      \"code\": \"```python\\nimport requests\\nimport json\\nimport sys\\n\\naccess_token = \\\"Bearer \\\" + sys.argv[1]\\nurl = 'https://geobigdata.io/s3creds/v1/prefix?duration=3600'\\nheaders = {'Content-Type': 'application/json',\\\"Authorization\\\": access_token}\\nresults = requests.get(url,headers=headers)\\n                \\nprint\\nprint \\\"bucket/directory to send output data to: \\\" + \\\"s3://\\\" + results.json()['bucket'] + \\\"/\\\" + results.json()['prefix']\\nprint\\nprint \\\"### save the information below to a s3creds file for use with S3CMD\\\"\\nprint\\nprint \\\"[default]\\\"\\nprint \\\"secret_key = \\\" + results.json()['S3_secret_key']\\nprint \\\"access_key = \\\" + results.json()['S3_access_key']\\nprint \\\"access_token = \\\" + results.json()['S3_session_token']\\n#print \\\"session_token = \\\" + results.json()['S3_session_token']\\nprint\\n```\\n\\nFinally, use the AWS CLI to list items in your storage location:\\n        aws s3 ls s3://gbd-customer-data/<your_prefix>/\",\n      \"language\": \"python\"\n    }\n  ]\n}\n[/block]","excerpt":"Python example script for sending a request to the GBDX Storage service and accessing the contents of the S3 bucket.","slug":"python-examples","type":"basic","title":"Python Examples"}

Python Examples

Python example scripts for sending requests to the GBDX S3 Storage Service and accessing the contents of your S3 bucket.

[block:api-header]
{
  "type": "basic",
  "title": "Python Example #1"
}
[/block]
This Python script covers the following steps:

1. Get your OAuth access token
2. Request temporary credentials from the GBDX S3 Storage Service
3. Send the temporary credentials to the AWS CLI
4. Execute a CLI command

First, install the AWS CLI:

```
pip install awscli
```

Copy this code into `create_s3tmp_creds.py`:

```python
import requests

# Your GBDX account credentials
username = '<username>'
password = '<password>'
api_key = '<api-key>'

token_url = 'https://geobigdata.io/auth/v1/oauth/token/'

# Get an OAuth access token
headers = {'Authorization': "Basic %s" % api_key}
body = {'grant_type': 'password', 'username': username, 'password': password}

r = requests.post(token_url, headers=headers, data=body)

access_token = r.json()['access_token']

# Request temporary S3 credentials, valid for 36000 seconds
url = 'https://geobigdata.io/s3creds/v1/prefix?duration=36000'
headers = {'Content-Type': 'application/json',
           'Authorization': 'Bearer ' + access_token}
results = requests.get(url, headers=headers)

# Print the credentials in AWS CLI credentials-file format
print("[DEFAULT]")
print("aws_secret_access_key=" + results.json()['S3_secret_key'])
print("aws_access_key_id=" + results.json()['S3_access_key'])
print("aws_session_token=" + results.json()['S3_session_token'])
```

Dump the credentials into your AWS CLI credentials file:

```
python create_s3tmp_creds.py > ~/.aws/credentials
```

Finally, use the AWS CLI to list the items in your storage location:

```
aws s3 ls s3://gbd-customer-data/<your_prefix>/
```

[block:api-header]
{
  "type": "basic",
  "title": "Python Example #2"
}
[/block]
This Python example assumes you already have a valid OAuth token, passed as the first command-line argument. It completes the following steps:

1. Request temporary credentials from the GBDX S3 Storage Service
2. Print the temporary credentials in a configuration-file format for S3CMD
3. Execute a CLI command

```python
import sys

import requests

# Pass your OAuth token as the first command-line argument
access_token = "Bearer " + sys.argv[1]

# Request temporary S3 credentials, valid for 3600 seconds
url = 'https://geobigdata.io/s3creds/v1/prefix?duration=3600'
headers = {'Content-Type': 'application/json', 'Authorization': access_token}
results = requests.get(url, headers=headers)

print()
print("bucket/directory to send output data to: " +
      "s3://" + results.json()['bucket'] + "/" + results.json()['prefix'])
print()
print("### save the information below to an s3creds file for use with S3CMD")
print()
print("[default]")
print("secret_key = " + results.json()['S3_secret_key'])
print("access_key = " + results.json()['S3_access_key'])
print("access_token = " + results.json()['S3_session_token'])
print()
```

Finally, use the AWS CLI to list the items in your storage location:

```
aws s3 ls s3://gbd-customer-data/<your_prefix>/
```
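If you would rather stay in Python instead of shelling out to the AWS CLI, you can hand the same temporary credentials straight to `boto3`. The sketch below is not part of the original GBDX examples; it assumes `boto3` is installed (`pip install boto3`) and, like Example #2, takes your OAuth token as the first command-line argument. The `bucket` and `prefix` keys are the same fields the s3creds response returns above.

```python
import sys

import boto3
import requests

# Request temporary credentials from the GBDX S3 Storage Service,
# exactly as in the examples above.
access_token = sys.argv[1]
url = 'https://geobigdata.io/s3creds/v1/prefix?duration=3600'
headers = {'Content-Type': 'application/json',
           'Authorization': 'Bearer ' + access_token}
creds = requests.get(url, headers=headers).json()

# Hand the temporary credentials directly to boto3; no
# ~/.aws/credentials file is needed.
s3 = boto3.client(
    's3',
    aws_access_key_id=creds['S3_access_key'],
    aws_secret_access_key=creds['S3_secret_key'],
    aws_session_token=creds['S3_session_token'],
)

# List the contents of your storage location, equivalent to:
#   aws s3 ls s3://gbd-customer-data/<your_prefix>/
response = s3.list_objects_v2(Bucket=creds['bucket'], Prefix=creds['prefix'])
for obj in response.get('Contents', []):
    print(obj['Key'])
```

Keep in mind that the credentials expire after the `duration` you requested, so re-run the s3creds request when they lapse.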