Python Examples

The following Python example scripts send a request to the GBDX Storage service and access the contents of the S3 bucket.

Python Example #1

This Python script covers the following steps:

  1. Get your OAuth access token
  2. Request temporary credentials from the GBDX S3 Storage Service
  3. Send temporary credentials to AWS CLI
  4. Execute CLI command

First, install the AWS CLI:

        pip install awscli

Copy this code into create_s3tmp_creds.py:

        import requests

        # Get an OAuth access token for your GBDX user
        username = '<username>'
        password = '<password>'
        api_key = '<api-key>'

        token_url = 'https://geobigdata.io/auth/v1/oauth/token/'

        headers = {'Authorization': "Basic %s" % api_key}
        body = {'grant_type': 'password', 'username': username, 'password': password}

        r = requests.post(token_url, headers=headers, data=body)

        access_token = r.json()['access_token']

        # Request temporary S3 credentials from the GBDX S3 Storage Service
        url = 'https://geobigdata.io/s3creds/v1/prefix?duration=36000'
        headers = {'Content-Type': 'application/json', 'Authorization': 'Bearer ' + access_token}
        results = requests.get(url, headers=headers)

        # Print the credentials in AWS CLI credentials-file format
        print("[DEFAULT]")
        print("aws_secret_access_key=" + results.json()['S3_secret_key'])
        print("aws_access_key_id=" + results.json()['S3_access_key'])
        print("aws_session_token=" + results.json()['S3_session_token'])
        
Dump the credentials into your AWS CLI credentials file:

        python create_s3tmp_creds.py > ~/.aws/credentials
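
Redirected into ~/.aws/credentials, the script's output should look roughly like this (placeholder values shown, not real credentials):

        [DEFAULT]
        aws_secret_access_key=<temporary-secret-key>
        aws_access_key_id=<temporary-access-key>
        aws_session_token=<temporary-session-token>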
        
        
Finally, use the AWS CLI to list items in your storage location:

        aws s3 ls s3://gbd-customer-data/<your_prefix>/
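
If you would rather stay in Python than shell out to the AWS CLI, the same temporary credentials returned by the S3 Storage Service can be passed to boto3. The snippet below is only a sketch, not part of the original example; the credential values and prefix are placeholders corresponding to the fields the script above prints, and the bucket and prefix values come from the same /s3creds response.

```python
import boto3

# Temporary credentials returned by the GBDX S3 Storage Service
# (the same fields the script above prints); placeholders shown here.
creds = {
    'S3_access_key': '<temporary-access-key>',
    'S3_secret_key': '<temporary-secret-key>',
    'S3_session_token': '<temporary-session-token>',
    'bucket': 'gbd-customer-data',
    'prefix': '<your_prefix>',
}

# Build an S3 client from the temporary credentials
s3 = boto3.client(
    's3',
    aws_access_key_id=creds['S3_access_key'],
    aws_secret_access_key=creds['S3_secret_key'],
    aws_session_token=creds['S3_session_token'],
)

# List objects under your prefix in the customer data bucket
response = s3.list_objects_v2(Bucket=creds['bucket'], Prefix=creds['prefix'])
for obj in response.get('Contents', []):
    print(obj['Key'])
```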

Python Example #2

This Python example assumes you already have a valid OAuth token. It completes the following steps:

  1. Request temporary credentials from the GBDX S3 Storage Service
  2. Send temporary credentials to AWS CLI
  3. Execute CLI command

```python
import requests
import sys

# The OAuth access token is passed as the first command-line argument
access_token = "Bearer " + sys.argv[1]

# Request temporary S3 credentials from the GBDX S3 Storage Service
url = 'https://geobigdata.io/s3creds/v1/prefix?duration=3600'
headers = {'Content-Type': 'application/json', 'Authorization': access_token}
results = requests.get(url, headers=headers)

# Print the bucket/prefix and the credentials in S3CMD configuration format
print()
print("bucket/directory to send output data to: " + "s3://" + results.json()['bucket'] + "/" + results.json()['prefix'])
print()
print("### save the information below to an s3creds file for use with S3CMD")
print()
print("[default]")
print("secret_key = " + results.json()['S3_secret_key'])
print("access_key = " + results.json()['S3_access_key'])
print("access_token = " + results.json()['S3_session_token'])
# print("session_token = " + results.json()['S3_session_token'])
print()
```
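
As a usage sketch (the script file name here is just an example, not from the original docs), pass your OAuth token on the command line and redirect the output to an s3creds file:

        python create_s3tmp_creds2.py <oauth_token> > s3creds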

Finally, use the AWS CLI to list items in your storage location:
        aws s3 ls s3://gbd-customer-data/<your_prefix>/
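
Because the script above writes the credentials in S3CMD configuration format, you can also point s3cmd at the s3creds file to perform the same listing. This is a sketch assuming a standard s3cmd installation:

        s3cmd -c s3creds ls s3://gbd-customer-data/<your_prefix>/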