
Manage the S3 protocol

This page describes the WEKA configuration of the S3 protocol.
The S3 protocol is integral to numerous cloud-native applications, and within the WEKA system, it offers a range of technical capabilities:
  • Data management:
    • Ingest data using the S3 protocol.
    • Access ingested data using S3 or other supported protocols.
  • Data exposure to S3:
    • Enable seamless migration of applications within the WEKA data platform by making existing data accessible through the S3 protocol. This integration ensures a smooth transition without the need for data relocation.
  • Cloud integration:
    • Enable cloud bursting to use new applications without relocating data.
  • Multi-protocol access with WEKA:
    • Leverage WEKA's scale, performance, and resiliency advantages.
    • Gradually transition applications to S3, maintaining data access through multiple protocols: POSIX, S3, SMB, NFS, and GPUDirect Storage.
The WEKA S3 service is designed for scalability and resilience. Implementation involves specifying WEKA servers running the S3 protocol and creating a logical S3 cluster to expose the service. Scalability is achieved through multiple servers, and load balancing or round-robin DNS integration facilitates access by numerous clients.
The WEKA S3 service builds on the WEKA filesystem service, mapping buckets to top-level directories and objects to files, allowing data exposure through various WEKA-supported protocols.
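Because buckets map to top-level directories and objects to files, the same data is reachable both through the S3 API and as regular files. The following sketch illustrates this mapping only; the mount point path is a hypothetical example, not a WEKA-defined location.

```python
from pathlib import PurePosixPath

# Illustrative sketch: WEKA maps each S3 bucket to a top-level directory
# of the backing filesystem and each object to a file under it.
# MOUNT_POINT is a hypothetical mount path used for the example.
MOUNT_POINT = PurePosixPath("/mnt/weka")

def object_to_path(bucket: str, key: str) -> PurePosixPath:
    """Return the POSIX path where an S3 object surfaces as a file."""
    return MOUNT_POINT / bucket / key

print(object_to_path("mybucket", "logs/app.log"))
```

With this mapping, an object written through S3 can be read immediately by POSIX, NFS, or SMB clients without any copy or conversion.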

Guidelines for managing S3 access and security

S3 access

Users can access S3 APIs with either authenticated or anonymous methods.
  • Authenticated S3 access: To gain authenticated S3 access, follow these guidelines:
    • Create a local WEKA user with an assigned S3 user role.
    • Attach an IAM policy to the S3 user, specifying S3 operations and resource permissions.
    • S3 users can generate temporary security tokens (STS AssumeRole) or employ service accounts with restricted permissions.
  • Anonymous access: You can use the following options:
    • Bucket policies.
    • Pre-signed URLs.

S3 security

  • Encryption of data at rest:
    • Data written through the S3 protocol can be encrypted at rest by configuring an encrypted filesystem.
  • Transport Layer Security (TLS):
    • Clients access the service securely through HTTPS, using the same certificates as WEKA's other API access points.

S3 audit

  • Auditing S3 API calls:
    • S3 API calls are auditable using an HTTP webhook service, which can integrate with applications such as Splunk.
  • Setting audit targets:
    • Set an audit target using the weka s3 cluster audit-webhook enable CLI command.
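On the receiving side, the audit target is simply an HTTP endpoint that accepts POSTed records. The following is a minimal sketch of such a receiver, assuming one JSON document per API call; the field names (`api`, `bucket`, `object`) are illustrative, not a WEKA-defined schema.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def summarize_audit_event(event: dict) -> str:
    """Render one audit record as a single log line (illustrative fields)."""
    return f"{event.get('api', '?')} {event.get('bucket', '')}/{event.get('object', '')}"

class AuditHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read the JSON body posted by the audit webhook
        length = int(self.headers.get('Content-Length', 0))
        event = json.loads(self.rfile.read(length) or b'{}')
        print(summarize_audit_event(event))
        self.send_response(200)
        self.end_headers()

# To run the receiver, then point the audit webhook at http://<host>:8080:
# HTTPServer(('', 8080), AuditHandler).serve_forever()
```

A production target (for example, a Splunk HTTP Event Collector) would replace the `print` with forwarding to the analytics backend.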

Workflow: Managing S3 resources in WEKA

This workflow guides you through the process of setting up and managing S3 resources within the WEKA system. It covers essential procedures, from creating an S3 cluster and bucket to uploading and downloading objects from S3 buckets. Each step demonstrates the seamless integration of WEKA's capabilities in handling data through the S3 protocol.
Select each tab in the specified order to explore the demonstrations.
1. Create an S3 cluster
2. Create an S3 bucket
3. Create a local S3 user
4. Obtain the certificate for the client
5. Test uploading and downloading objects from S3 buckets
Create an S3 cluster within the WEKA environment, specifying the servers to run the S3 protocol and establishing a logical cluster for exposing the S3 service.
A predefined configuration filesystem (.config_fs in this demo) must exist to maintain the persistent cluster-wide protocol configurations.
Create an S3 cluster
Alternative CLI command
weka s3 cluster create default .config_fs --all-servers
Related topics
Create an S3 cluster (using the GUI)
Create an S3 cluster (using the CLI)
Create an S3 bucket, the fundamental container for storing and organizing objects within the WEKA S3 service.
Create an S3 bucket
Alternative CLI command
weka s3 bucket create mybucket
Related topics
Create a bucket (using the GUI)
Create a bucket (using the CLI)
Create a dedicated local user with the S3 role and assign an S3 policy to it to facilitate secure access to S3 resources.
The S3 local user name and password serve as the S3 access key and secret key, respectively, for uploading and downloading objects from S3 buckets.
Create a local S3 user
Alternative CLI commands
weka user add S3_user S3
weka s3 policy attach readwrite
Related topics
Create a local user (using the GUI)
Create a local user (using the CLI)
You can implement either the self-signed certificate or a custom certificate. The cluster TLS certificate is enabled using an auto-generated self-signed certificate, providing access to the GUI, CLI, and API through HTTPS. If you prefer a custom TLS certificate, you can set it in place of the auto-generated self-signed certificate.
To apply your chosen certificate, download it to your client and place it in the designated folder for seamless integration.
Alternative CLI command
weka security tls download
With an S3 cluster accessible through its port (default: 9000), a bucket, and an S3 user with an attached policy in place, a client can upload and download objects from the S3 bucket.
Users can use any client application that can access the bucket through the WEKA cluster URL and port.
This example demonstrates using Boto3, the official Python client of AWS.
In the following code snippet, set these parameters in the s3 = boto3.resource section:
  • endpoint_url and port: The URL and port of the WEKA S3 cluster.
  • aws_access_key_id: The S3 local user name.
  • aws_secret_access_key: The S3 local user password.
import boto3
import logging
from botocore.exceptions import ClientError
from botocore.client import Config

# Use Signature Version 4 for request authentication
config = Config(signature_version='s3v4')

# endpoint_url: the URL and port of the WEKA S3 cluster (placeholder values below)
# aws_access_key_id / aws_secret_access_key: the S3 local user name and password
s3 = boto3.resource('s3',
                    endpoint_url='https://weka-cluster:9000',
                    aws_access_key_id='S3_user',
                    aws_secret_access_key='S3_user_password',
                    config=config)

try:
    # Upload the local file 'myfile' to bucket 'mybucket1' with 'my_uploaded_object' as the object name.
    s3.Bucket('mybucket1').upload_file('myfile', 'my_uploaded_object')
    # Download the object 'my_uploaded_object' from bucket 'mybucket1' and save it locally as 'my_downloaded_object'.
    s3.Bucket('mybucket1').download_file('my_uploaded_object', 'my_downloaded_object')
    print("Downloaded 'my_uploaded_object' as 'my_downloaded_object'.")
except ClientError as e:
    logging.error(e)
Upload and download
Test the upload and download using the Python script.
❯ python3
Downloaded 'my_uploaded_object' as 'my_downloaded_object'.
❯ cat my_downloaded_object
Hello World!
As a quick workflow guide, here is a summary of the CLI commands for configuring your S3 cluster and implementing the desired settings:
weka s3 cluster create default .config_fs --all-servers
weka s3 bucket create mybucket
weka user add S3_user S3
weka s3 policy attach readwrite
weka security tls download