Note: AWS S3 is currently supported only as a Destination. This guide doesn’t cover the DataPrep setup for AWS S3.
Description
AWS S3 is an object storage service that offers industry-leading scalability, data availability, security, and performance. Companies of all sizes can use it to store and protect any amount of data for a wide range of use cases.
Setup guide
Follow our setup guide to connect AWS S3 to Improvado.
Choose an encryption option
Choose a server-side encryption option:
SSE-S3 (recommended) - server-side encryption with AWS S3 managed keys. Learn more.
SSE-KMS - server-side encryption with AWS KMS keys. Learn more. Make sure to give the required permissions for the AWS-managed key.
SSE-KMS with customer-managed key - server-side encryption with customer-managed AWS KMS keys. Learn more. Make sure to give the required permissions for an encryption key.
No storage encryption - not recommended unless the AWS S3 bucket has default encryption.
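If your bucket doesn’t have default encryption enabled yet, here is a minimal sketch of turning on SSE-S3 default encryption with boto3; the bucket name is a placeholder.

```python
import boto3

s3 = boto3.client("s3")

# Enable SSE-S3 (AES-256) default encryption on the bucket.
# Replace "your-bucket-name" with your real bucket name.
# For SSE-KMS, use SSEAlgorithm "aws:kms" and add "KMSMasterKeyID".
s3.put_bucket_encryption(
    Bucket="your-bucket-name",
    ServerSideEncryptionConfiguration={
        "Rules": [
            {"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "AES256"}}
        ]
    },
)
```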
Important: Make sure to provide your AWS S3 bucket information by following our guide.
Permissions
Enable the following permissions for your AWS S3 bucket:
Select the Partition by option from the dropdown. {%dropdown-button name="partition-by"%}
{%dropdown-body name="partition-by"%}
Partitioning defines how the data is split into files during the upload.
{%dropdown-end%}
Select the Encryption option from the dropdown. Learn more about all available encryption options here.
(SSE-KMS with customer-managed key) Enter the Encryption Key.
Select whether you want to Use load by account for this Destination. {%dropdown-button name="use-load-by-account"%}
{%dropdown-body name="use-load-by-account"%}
If enabled, the File name field must include the ```{{account}}``` variable (see the example below). Enable this option if you want to use a specific account for the data load.
{%dropdown-end%}
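For example, a hypothetical file name such as ```{{data_source}}_{{account}}.csv``` satisfies this requirement when Use load by account is enabled.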
How to provide credentials to Improvado
There are three ways to provide your credentials. Choose one depending on the selected server-side encryption type:
Option #2 (Share Read and Write access with Improvado’s AWS account)
{%docs-informer info%}
Available for SSE-S3 and SSE-KMS (with customer-managed keys) only.
{%docs-informer-end%}
Important: If you plan to use this option, notify our Support team or your CSM, and we will create specific users to load data and provide support. We’ll create a Destination connection for you.
Share Read and Write access with Improvado’s AWS Account ID:
Create an AWS S3 bucket.
Select the Permissions tab on the Bucket Settings page.
In the Bucket policy, click the Edit button.
Copy & paste the Policy example below.
~ Change ```your-bucket-name``` to your real Bucket name.
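As an illustration, below is a minimal boto3 sketch of a bucket policy that shares read and write access with another AWS account. ```IMPROVADO_AWS_ACCOUNT_ID``` is a placeholder that our team will provide, and the listed actions are an assumption of the minimum needed for loading data.

```python
import json

import boto3

s3 = boto3.client("s3")
bucket = "your-bucket-name"  # replace with your real bucket name
improvado_account_id = "IMPROVADO_AWS_ACCOUNT_ID"  # placeholder; provided by Improvado

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            # Read/write access to objects in the bucket.
            "Sid": "ImprovadoObjectAccess",
            "Effect": "Allow",
            "Principal": {"AWS": f"arn:aws:iam::{improvado_account_id}:root"},
            "Action": ["s3:GetObject", "s3:PutObject", "s3:DeleteObject"],
            "Resource": f"arn:aws:s3:::{bucket}/*",
        },
        {
            # Listing the bucket and resolving its region.
            "Sid": "ImprovadoBucketAccess",
            "Effect": "Allow",
            "Principal": {"AWS": f"arn:aws:iam::{improvado_account_id}:root"},
            "Action": ["s3:ListBucket", "s3:GetBucketLocation"],
            "Resource": f"arn:aws:s3:::{bucket}",
        },
    ],
}

s3.put_bucket_policy(Bucket=bucket, Policy=json.dumps(policy))
```

The same JSON document can also be pasted directly into the Bucket policy editor in the console.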
Note: AWS allows sharing only customer-managed KMS keys (keys that you created). The AWS-managed KMS key (the key that was created by AWS automatically) cannot be shared.
Method 1 (Add Improvado’s AWS account to KMS Key settings)
The KMS Key ID is required.
Create a KMS key.
Open the Key settings.
Go to Other AWS accounts and click the Add other AWS account button.
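If you manage the key policy with code rather than through the console, here is a sketch of granting key usage to another AWS account; the account ID is a placeholder provided by our team, and the listed KMS actions are an assumption.

```python
import json

import boto3

kms = boto3.client("kms")
key_id = "your-kms-key-id"  # replace with your KMS key ID
external_account_id = "IMPROVADO_AWS_ACCOUNT_ID"  # placeholder; provided by Improvado

# Read the current key policy, append a statement for the external account,
# and write it back (similar to what the console's "Other AWS accounts" option does).
policy = json.loads(kms.get_key_policy(KeyId=key_id, PolicyName="default")["Policy"])
policy["Statement"].append(
    {
        "Sid": "AllowExternalAccountUseOfKey",
        "Effect": "Allow",
        "Principal": {"AWS": f"arn:aws:iam::{external_account_id}:root"},
        "Action": [
            "kms:Encrypt",
            "kms:Decrypt",
            "kms:ReEncrypt*",
            "kms:GenerateDataKey*",
            "kms:DescribeKey",
        ],
        "Resource": "*",
    }
)
kms.put_key_policy(KeyId=key_id, PolicyName="default", Policy=json.dumps(policy))
```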
Provide your AWS S3 bucket information to Improvado (Option #2)
Provide our Support Team or your CSM with the following information:
Bucket Name
AWS Region
Our team will create specific users to load data and provide support.
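If you’re not sure which AWS Region the bucket is in, here is a quick boto3 sketch to look it up; the bucket name is a placeholder.

```python
import boto3

s3 = boto3.client("s3")

# LocationConstraint is None for buckets created in us-east-1.
location = s3.get_bucket_location(Bucket="your-bucket-name")["LocationConstraint"]
print(location or "us-east-1")
```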
Option #3 (Share access with Improvado account using our Canonical ID)
{%docs-informer info%}
Available for SSE-S3 only.
{%docs-informer-end%}
Important: If you plan to use this option, notify our Support team or your CSM, and we will create specific users to load data and provide support. We’ll create a Destination connection for you.
Share access with the Improvado account using our Canonical ID:
Create an AWS S3 bucket.
Select the Permissions tab on the Bucket Settings page.
In the Access control list (ACL), click the Edit button.
Click the Add grantee button in the Access for other AWS accounts section.
Paste Improvado’s Canonical ID ```5f2dfe7db1abc67daeea58fea77fb0c399ed924a2f88cf36d28738b7e1c838ef``` into the Grantee field.
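If you prefer to grant access programmatically instead of in the console, here is a minimal boto3 sketch; it assumes ACLs are enabled on the bucket and re-applies the owner’s FULL_CONTROL grant, because ```put_bucket_acl``` replaces the existing ACL.

```python
import boto3

s3 = boto3.client("s3")
bucket = "your-bucket-name"  # replace with your real bucket name

# Improvado's Canonical ID (from this guide).
improvado_canonical_id = "5f2dfe7db1abc67daeea58fea77fb0c399ed924a2f88cf36d28738b7e1c838ef"

# put_bucket_acl overwrites the ACL, so keep FULL_CONTROL for the bucket owner.
owner_id = s3.get_bucket_acl(Bucket=bucket)["Owner"]["ID"]

s3.put_bucket_acl(
    Bucket=bucket,
    GrantFullControl=f'id="{owner_id}"',
    GrantRead=f'id="{improvado_canonical_id}"',
    GrantWrite=f'id="{improvado_canonical_id}"',
)
```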
```{{workspace_id}}``` and ```{{workspace_title}}``` are optional parameters that provide additional information about the workspace used for the destination connection.
```{{data_source}}``` is the data provider (integration, connector).
```{{data_table_title}}``` is an object that contains all extraction orders with the same granularity (dimensional schema).
```{{report_type}}``` is a set of fields such as metrics, properties, dimensions, etc.
```{{account}}``` is an optional parameter that allows you to add a specific account for the data load.
~ You must enable the Use load by account option to add this parameter to the File name.
If you use the ```/{{YYYY}}/{{MM}}/{{DD}}``` setting, the data is added to new folders daily. Each new record does not delete the previous one, even for data that contains no date.
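For example, a hypothetical file name template such as ```{{data_source}}/{{account}}/{{YYYY}}/{{MM}}/{{DD}}/report.csv``` creates a separate folder for each upload date, so previous days’ files remain untouched.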