Prerequisites

- You must have the AWS CLI installed. See Installing the AWS CLI.
- You need a GBDX access token to make API requests. See the Authentication Course.
Step 1: Request Temporary Credentials to the GBDX S3 location
Make a GET request to this API endpoint:

https://geobigdata.io/s3creds/v1/prefix
The temporary credentials are valid for 3600 seconds (one hour) by default. To change the duration, append ?duration=<seconds> to the request. For example, make a GET request to this endpoint to request the maximum duration:

https://geobigdata.io/s3creds/v1/prefix?duration=129600
Duration Type | Value |
---|---|
Minimum | 900 seconds (15 minutes) |
Default | 3600 seconds (1 hour) |
Maximum | 129600 seconds (36 hours) |
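As a sketch, the request above can be issued with curl. GBDX_TOKEN is assumed to hold a valid GBDX access token (obtaining one is covered in the Authentication Course, not here):

```shell
# Build the s3creds request URL; DURATION may be 900-129600 seconds.
DURATION=3600
URL="https://geobigdata.io/s3creds/v1/prefix?duration=${DURATION}"
echo "${URL}"

# Send the GET request with the GBDX access token (assumed to be in
# $GBDX_TOKEN). Uncomment to run against the live API:
# curl -s -H "Authorization: Bearer ${GBDX_TOKEN}" "${URL}"
```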
Response Example
This is an example response with temporary credentials.
JSON
{
  "S3_secret_key": "5ZzRIJzfsevuO/tN7o5SuEM6bpHMJdTBCFIzdH4u",
  "prefix": "232875da-2659-42de-ac96-03e6e7918fc8",
  "bucket": "gbd-bucket-name",
  "S3_access_key": "FXILRTWOPRSR34Y5IQNJ",
  "S3_session_token": "FQoGZXIvYXdzEAYaDCD44hB5RaPHzVF7kiKTA8sESsZKd1kPjAwrQa0AGDiVRv4W8sRFPUfWfi9oqrvYCV29GGizqkdoQTu6TInRK/3fp4KQ8FsjJyVQ8hc0yL2778T2fK/zy8HofnusN267rYMdUbuXTSQml/LS31KiLrwmul7satKqg8Ji0ap+9sJb1nzVBun5sTiyrwXwW05RJmO/63dwwedI9bQ7BY+usDBQ7tZBVe5QlvbSYe1X0oHaUcIIxzL2vw4MtLgOVemOUlFqdKM+nA841K8mvHYkiIQHOL0AGJUt/JgZ6CmWiFLlEeYOFXeZgjkWarKWrMpeKmP9473J7LL8YS8gnvUPdBGxG9Pq7OIVk8yby6brhp1jmELFJlnbtQjwoCl+FrSd3vLRE552cBbKVSBDtaqMb7fqQRpGW9Dms0CIRAjLLTATX+S/RNH2j4QPf8KS2/ZD1xNeHwQAB+GyQTp+fyUMMfy8yzIEoXUURWuSKzg9Mi+QUZh0k+j0Wv2XD6gh0nzMic6dUmyRjdw8q57xQQqwjFgrKcRix6fCNke0+iRePkjhgoso2ISP5QU="
}
Note: These are examples only, and are not valid credentials.
The GBDX S3 Storage Service can be used to request temporary credentials to a specific folder or object. See S3 Storage Service API Requests for a complete list of API requests.
Step 2: Export Temporary Credentials
The AWS CLI requires the following environment variables:
AWS_ACCESS_KEY_ID
AWS_SECRET_ACCESS_KEY
AWS_SESSION_TOKEN
The API request returns these values, but uses a different naming convention. This table shows how the AWS environment variable names map to the field names in the GBDX response.
AWS Name | GBDX Name |
---|---|
AWS_ACCESS_KEY_ID | S3_access_key |
AWS_SECRET_ACCESS_KEY | S3_secret_key |
AWS_SESSION_TOKEN | S3_session_token |
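The mapping above can be sketched in shell. The values here are placeholders standing in for the fields of the Step 1 response, not real credentials:

```shell
# Placeholder values standing in for the s3creds response fields;
# these are not valid credentials.
S3_access_key="EXAMPLEACCESSKEY"
S3_secret_key="EXAMPLESECRETKEY"
S3_session_token="EXAMPLESESSIONTOKEN"

# Export each value under the name the AWS CLI expects.
export AWS_ACCESS_KEY_ID="${S3_access_key}"
export AWS_SECRET_ACCESS_KEY="${S3_secret_key}"
export AWS_SESSION_TOKEN="${S3_session_token}"
```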
LINUX USERS To export on Linux, run the following commands to set the values:
export AWS_ACCESS_KEY_ID=[YOUR_S3_access_key]
export AWS_SECRET_ACCESS_KEY=[YOUR_S3_secret_key]
export AWS_SESSION_TOKEN=[YOUR_S3_session_token]
"Export" Example:
This example uses the credentials from the example response provided in step 2.
export AWS_ACCESS_KEY_ID=ASIAIE3GWTJ7LLJLMCYB
export AWS_SECRET_ACCESS_KEY=JjnAy0NdO2WO5N7JgGRTuCoy3zqQdnSA4KLD9ogb
export AWS_SESSION_TOKEN=AQoDYXdzEBoa4ANP+zSjA4Pi6KLXiqw91T6oYhaJCMFmdtLcVcQcrGd1aIcy/3J8ZLfDmIkzsDWJhL8TvPvASaqxt/xYj8+SmlGNgnGH1jpSNwsDzCqTItlm6N5y8BZjCgSj3EKoyWW7XbTAAn+evMfMQEPlZM6onEdsYsm0CVx0DY8JnvTJBhA7I06/3g8XmSqOTxOfpqsYK5jt1JxseG956UOAWD35k34/r2BSQ+GKPpQ/drlcfPlQR/lDBopi8VejFh0Wq0GRUHg+yEJvZ1Ytrtm8R1MdMasXb3jVtMxm4SNH5/dVEP61yq9cA5B9UIl2LoFJYGx+fSnwRVaC0/1NjJzRNJmsR48Kyfaop1FNsKuCXWnGg1LWktnJRZft3vs+eaXQ2rvscex9cwxxg0Er9I9B1F0qD9ucHyrpxgRetMMymp3omIHMB3wcI+QCx39MKkBDCdpXNE3fCd0TaCbXX48XbJVaACCp60aNfvtkt7nRkyDsTx/gQ6GUpPiONxX8BKYLbsg6yvcXCyy6umAZBcOq+dYWxm5MSvIjJHFHbgS1+6xaJTkyAmtXRcJHzWwaUTmDe9Fh/qXA8aVu9NW2hf/aok61HqZqothqIEQeox7wI+21spXAh+uT+kT2YIDsnxRxR1GpMXkgzPLFtwU=
WINDOWS USERS To export on Windows, run the following commands to set the values.
SET AWS_ACCESS_KEY_ID=[YOUR_S3_access_key]
SET AWS_SECRET_ACCESS_KEY=[YOUR_S3_secret_key]
SET AWS_SESSION_TOKEN=[YOUR_S3_session_token]
"SET" Example:
SET AWS_ACCESS_KEY_ID=ASIAIE3GWTJ7LLJLMCYB
SET AWS_SECRET_ACCESS_KEY=JjnAy0NdO2WO5N7JgGRTuCoy3zqQdnSA4KLD9ogb
SET AWS_SESSION_TOKEN=AQoDYXdzEBoa4ANP+zSjA4Pi6KLXiqw91T6oYhaJCMFmdtLcVcQcrGd1aIcy/3J8ZLfDmIkzsDWJhL8TvPvASaqxt/xYj8+SmlGNgnGH1jpSNwsDzCqTItlm6N5y8BZjCgSj3EKoyWW7XbTAAn+evMfMQEPlZM6onEdsYsm0CVx0DY8JnvTJBhA7I06/3g8XmSqOTxOfpqsYK5jt1JxseG956UOAWD35k34/r2BSQ+GKPpQ/drlcfPlQR/lDBopi8VejFh0Wq0GRUHg+yEJvZ1Ytrtm8R1MdMasXb3jVtMxm4SNH5/dVEP61yq9cA5B9UIl2LoFJYGx+fSnwRVaC0/1NjJzRNJmsR48Kyfaop1FNsKuCXWnGg1LWktnJRZft3vs+eaXQ2rvscex9cwxxg0Er9I9B1F0qD9ucHyrpxgRetMMymp3omIHMB3wcI+QCx39MKkBDCdpXNE3fCd0TaCbXX48XbJVaACCp60aNfvtkt7nRkyDsTx/gQ6GUpPiONxX8BKYLbsg6yvcXCyy6umAZBcOq+dYWxm5MSvIjJHFHbgS1+6xaJTkyAmtXRcJHzWwaUTmDe9Fh/qXA8aVu9NW2hf/aok61HqZqothqIEQeox7wI+21spXAh+uT+kT2YIDsnxRxR1GpMXkgzPLFtwU=
Note: S3cmd can be used instead of the AWS CLI to access data in your S3 bucket. To set up and use S3cmd on Linux, see http://s3tools.org/s3cmd
Step 3: Access the Contents of your S3 location
Now that you've set your temporary credentials for AWS CLI, you can access the contents of your S3 location.
To perform these actions, you need to know the name of the GBDX S3 bucket and the prefix for your account. Both are returned in the Step 1 response, in the "bucket" and "prefix" fields.
List the contents of your S3 location
$ aws s3 ls s3://<bucket>/<prefix>/
Download a single file from the S3 location
$ aws s3 cp s3://<bucket>/<prefix>/remote_text_file.txt .
Download the contents of a folder or prefix
$ aws s3 cp s3://<bucket>/<prefix>/<folder> . --recursive
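Putting the pieces together, here is a sketch that builds the S3 path from the placeholder bucket and prefix values shown in the Step 1 example response. The aws calls are commented out because they require live credentials, and my_folder is a hypothetical folder name:

```shell
# Bucket and prefix come from the "bucket" and "prefix" fields of the
# s3creds response; these are the placeholder values from the example.
BUCKET="gbd-bucket-name"
PREFIX="232875da-2659-42de-ac96-03e6e7918fc8"
S3_PATH="s3://${BUCKET}/${PREFIX}/"
echo "${S3_PATH}"

# Uncomment once the temporary credentials are exported:
# aws s3 ls "${S3_PATH}"
# aws s3 cp "${S3_PATH}remote_text_file.txt" .
# aws s3 cp "${S3_PATH}my_folder" ./my_folder --recursive
```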
For a list of S3 file commands, see the AWS Command Line Interface Reference Page.