This short guide explains how to connect Akave Cloud to Rclone, a command-line tool for managing cloud storage. It walks through configuring Akave Cloud as a remote in Rclone and migrating files from AWS S3 into Akave Cloud with a few simple commands.
The integration provides a quick way to move or sync your data from major cloud providers (AWS, GCP, Azure) into Akave’s verifiable, decentralized, and cost-efficient object storage.
Why Use Rclone with Akave Cloud?
Rclone is often called the “Swiss Army knife of cloud storage” because it simplifies complex operations such as file synchronization, migration, encryption, and backup across local and remote storage. It performs incremental syncing to save bandwidth, resumes interrupted transfers, and supports server-side operations that minimize local bandwidth use when moving files between cloud providers.
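For context, here are a few of the everyday operations Rclone handles; the remote and bucket names below are placeholders:

# One-way sync: make the destination match the source
rclone sync source-remote:my-bucket dest-remote:my-bucket

# Copy files without deleting anything at the destination
rclone copy source-remote:my-bucket dest-remote:my-bucket

# Verify that source and destination contents match
rclone check source-remote:my-bucket dest-remote:my-bucket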
Key benefits of combining Rclone and Akave Cloud:
- Simple Migration: Move entire buckets or datasets from any major cloud to Akave using a single command.
- Full S3 Compatibility: Akave supports the same bucket and object structure as AWS, so no schema or path changes are needed.
- Immutable and Verifiable: Data becomes cryptographically verifiable and tamper-proof once migrated.
- No Lock-In: You can use the same CLI workflows you already know.
Prerequisites
Before starting, make sure you have:
- Rclone CLI installed (see the install sketch after this list)
- Akave Cloud credentials (Access Key ID, Secret Access Key, and Endpoint URL)
- Access to your source bucket (e.g., AWS S3, GCP, or Azure)
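If Rclone is not installed yet, the official install script or your system’s package manager will work; pick whichever fits your setup:

# Official install script (Linux/macOS)
curl https://rclone.org/install.sh | sudo bash

# Or via a package manager
brew install rclone        # macOS
sudo apt install rclone    # Debian/Ubuntu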
Step-by-Step: Configuring Akave Cloud as a Remote
1. Open Rclone Configuration
In your terminal, run:
rclone config
Select n to create a new remote, then give it a name, for example:
name> akave
2. Choose the Storage Type
When prompted for the type of storage, choose AWS S3-compatible.
Storage> 4 # AWS S3
provider> 34 # Any other S3 compatible storage
This allows Rclone to connect to Akave’s S3-compatible API.
3. Add Your Akave Credentials
When prompted to “Get AWS credentials from runtime,” select “false” or press Enter to accept the default:
env_auth> 1 # false
Enter your Access Key ID and Secret Access Key when prompted:
Access Key ID> [your_akave_access_key]
Secret Access Key> [your_akave_secret_key]
You can leave the region field blank, then enter your Akave Endpoint URL, for example:
Endpoint> https://[your-akave-endpoint]
When asked about advanced options or auto configuration, choose the defaults and confirm.
✅ You now have a new remote named akave.
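Behind the scenes, the wizard writes these settings to Rclone’s config file (typically ~/.config/rclone/rclone.conf). Here is a rough sketch of what the entries might look like, including a hypothetical AWS S3 source remote named s3 that the sync step below assumes; all values are placeholders:

[akave]
type = s3
provider = Other
access_key_id = your_akave_access_key
secret_access_key = your_akave_secret_key
endpoint = https://your-akave-endpoint

[s3]
type = s3
provider = AWS
access_key_id = your_aws_access_key
secret_access_key = your_aws_secret_key
region = us-east-1

You can inspect the file at any time with rclone config show.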
Step-by-Step: Migrating Data to Akave Cloud
4. Create a New Bucket on Akave
Create a new bucket (for example, bart-rclone-bucket) directly from the CLI:
rclone mkdir akave:bart-rclone-bucket
This creates an empty destination bucket on Akave Cloud.
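To double-check that the bucket was created, list the buckets on the Akave remote (the bucket name here is the example used above):

rclone lsd akave: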
5. Sync Files from AWS S3 to Akave
With both remotes configured, use the sync command to transfer all files from your source (e.g., AWS S3) to Akave:
rclone sync s3:your-source-bucket akave:bart-rclone-bucket --progress
The --progress flag shows a live view of the transfer process.
✅ The files from the AWS S3 bucket are copied over to the Akave bucket.
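For larger migrations, a few standard Rclone flags can help; the values below are just examples:

# Preview the transfer without copying anything
rclone sync s3:your-source-bucket akave:bart-rclone-bucket --dry-run

# Increase parallel transfers and force checksum comparison
rclone sync s3:your-source-bucket akave:bart-rclone-bucket --progress --transfers 8 --checksum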
6. Verify the Transfer
Once the sync completes, confirm that the files exist in your Akave bucket. You can list the bucket with Rclone, or point the AWS CLI at Akave’s endpoint, to verify that all files are now stored in Akave Cloud.
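For example, run a quick integrity check with Rclone, or list the bucket via the AWS CLI pointed at Akave’s endpoint (the endpoint below is a placeholder):

# Compare source and destination and report any differences
rclone check s3:your-source-bucket akave:bart-rclone-bucket

# List the Akave bucket with the AWS CLI
aws s3 ls s3://bart-rclone-bucket --endpoint-url https://[your-akave-endpoint]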
Summary
That’s it: your data has been migrated from AWS S3 to Akave Cloud using Rclone.
You can now:
- Continue syncing updates with the same command (see the scheduling sketch below)
- Manage your data from the Rclone CLI
- Enjoy verifiable, immutable, and cost-efficient storage across the Akave network
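If you want to keep the Akave bucket up to date automatically, here is a minimal sketch using cron; the schedule and log path are assumptions:

# Run the sync every night at 02:00 and log the results
0 2 * * * rclone sync s3:your-source-bucket akave:bart-rclone-bucket --log-file=/var/log/rclone-akave.log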
▶️ Watch the demo: How to Migrate from AWS, GCP & Azure to Akave Cloud Using Rclone

