# Python SDK
Edge does not provide its own Python SDK. For Storage, use boto3. For Compute, CDN, and DNS, use the REST API with requests.
## How it works

Edge Storage is S3-compatible, so you can use boto3 (the AWS SDK for Python) with Edge's endpoint and credentials. For Compute, CDN, and DNS, call the REST API at `https://api.edge.network` using the requests library.
## Storage: Installing and configuring

Install boto3 (and requests for REST API calls):

```shell
pip install boto3 requests
```

Configure the S3 client with Edge's Storage endpoint (region `us-east-1`, path-style addressing):
```python
import os

import boto3
from botocore.config import Config

s3 = boto3.client(
    's3',
    endpoint_url='https://storage.edge.network',
    region_name='us-east-1',
    aws_access_key_id=os.environ['EDGE_ACCESS_KEY'],
    aws_secret_access_key=os.environ['EDGE_SECRET_KEY'],
    config=Config(
        signature_version='s3v4',
        s3={'addressing_style': 'path'},
    ),
)
```

Create the bucket (the location constraint can be omitted for Edge):

```python
s3.create_bucket(Bucket='my-bucket')
```

## Storage operations
Upload a file:

```python
with open('local/file.txt', 'rb') as f:
    s3.upload_fileobj(
        f,
        'my-bucket',
        'backups/file.txt',
        ExtraArgs={'ContentType': 'text/plain'},
    )
```

List objects:

```python
paginator = s3.get_paginator('list_objects_v2')
for page in paginator.paginate(Bucket='my-bucket', Prefix='backups/'):
    for obj in page.get('Contents', []):
        print(obj['Key'])
```

Download a file:

```python
s3.download_file('my-bucket', 'backups/file.txt', 'local/file.txt')
```

Generate a presigned URL (valid for one hour):

```python
url = s3.generate_presigned_url(
    'get_object',
    Params={'Bucket': 'my-bucket', 'Key': 'backups/file.txt'},
    ExpiresIn=3600,
)
```

## Compute, CDN, DNS: REST API with requests
For non-Storage products (Compute, CDN, and DNS), use requests:

```python
import os

import requests

resp = requests.get(
    'https://api.edge.network/api/compute/vms',
    headers={
        'Authorization': f"Bearer {os.environ['EDGE_API_KEY']}",
        'Content-Type': 'application/json',
    },
)
resp.raise_for_status()
data = resp.json()
```

Use the same pattern for CDN, DNS, and VM operations; see the API Reference for the available endpoints.
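Since every Edge API call shares the same base URL and auth header, it can help to wrap requests in a small helper. This is a sketch, assuming the `EDGE_API_KEY` variable from above; `edge_request` is a local convenience function, not part of any Edge SDK:

```python
import os

import requests

API_BASE = 'https://api.edge.network'


def edge_request(method: str, path: str, **kwargs):
    """Send an authenticated request to the Edge REST API and return parsed JSON.

    Assumes EDGE_API_KEY is set in the environment; raises on non-2xx responses.
    """
    headers = dict(kwargs.pop('headers', {}))
    headers.setdefault('Authorization', f"Bearer {os.environ['EDGE_API_KEY']}")
    headers.setdefault('Content-Type', 'application/json')
    resp = requests.request(method, f"{API_BASE}{path}", headers=headers, **kwargs)
    resp.raise_for_status()
    return resp.json()


# Example: vms = edge_request('GET', '/api/compute/vms')
```

Passing `json=...` or `params=...` through `**kwargs` works as with plain requests calls.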
## Practical example: Backup script

This script uploads a local directory tree to Edge Storage under a dated prefix, reusing the `s3` client configured above:
```python
from datetime import datetime, timezone
from pathlib import Path


def backup_to_edge(local_dir: str, bucket: str, prefix: str = 'backups'):
    """Upload a directory tree to Edge Storage under a timestamped prefix."""
    base = Path(local_dir)
    timestamp = datetime.now(timezone.utc).strftime('%Y%m%d-%H%M%S')
    for path in base.rglob('*'):
        if path.is_file():
            # Build a POSIX-style key even when running on Windows.
            key = f"{prefix}/{timestamp}/{path.relative_to(base).as_posix()}"
            s3.upload_file(str(path), bucket, key)
            print(f"Uploaded {path} -> {key}")


backup_to_edge('./data', 'my-bucket')
```

Run it via cron for scheduled backups. To retain only recent backups, add lifecycle or expiry rules in the control panel.
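If you would rather prune old backups from the script itself instead of the control panel, the timestamped prefixes make that straightforward, because `%Y%m%d-%H%M%S` stamps sort lexicographically in date order. A sketch using the same boto3 client; `prune_backups` and its keep-count policy are illustrative, not an Edge feature:

```python
def prune_backups(s3, bucket: str, prefix: str = 'backups', keep: int = 5):
    """Delete all but the `keep` most recent timestamped backup prefixes."""
    paginator = s3.get_paginator('list_objects_v2')
    stamps, keys = set(), []
    for page in paginator.paginate(Bucket=bucket, Prefix=f'{prefix}/'):
        for obj in page.get('Contents', []):
            keys.append(obj['Key'])
            parts = obj['Key'].split('/')
            if len(parts) > 2:
                stamps.add(parts[1])
    # '%Y%m%d-%H%M%S' timestamps sort lexicographically in chronological order.
    expired = set(sorted(stamps)[:-keep]) if len(stamps) > keep else set()
    doomed = [k for k in keys if k.split('/')[1] in expired]
    # delete_objects accepts at most 1000 keys per request.
    for i in range(0, len(doomed), 1000):
        batch = doomed[i:i + 1000]
        s3.delete_objects(
            Bucket=bucket,
            Delete={'Objects': [{'Key': k} for k in batch]},
        )
    return doomed
```

Calling `prune_backups(s3, 'my-bucket', keep=5)` after `backup_to_edge` keeps the five most recent snapshots and deletes the rest.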