In just a few steps, you’ll learn how to:
- Create an external stage on Akave Cloud
- Copy data from Snowflake to Akave
- Query external data directly from Snowflake
All without changing your SQL or storage workflows.
What Are Snowflake External Tables?
Snowflake external tables allow you to query data stored outside of Snowflake, such as files in Amazon S3, Azure Blob Storage, Google Cloud Storage, or any S3-compatible provider like Akave Cloud. This means you do not need to load the data into Snowflake’s internal storage; instead, queries run directly against the files in the external object store.
Before querying external data, you first create an external stage in Snowflake, which acts as a pointer to the data stored externally. The external stage represents the connection between Snowflake and the object storage system; the files themselves reside outside Snowflake, in the location defined by the stage.
External tables provide a read-only SQL interface on top of files in formats such as CSV, Parquet, or JSON. In practice, you can analyze large datasets directly from object storage for greater cost efficiency and flexibility.
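For example, each row of an external table exposes the raw file record through a VARIANT column named VALUE, which you can cast into typed columns. A minimal sketch, assuming a hypothetical external table my_external_table backed by CSV files:
SELECT VALUE:c1::NUMBER, VALUE:c2::STRING
FROM my_external_table
LIMIT 5;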
This approach is ideal for organizations wanting to offload storage expenses, integrate with existing data lakes, or quickly access new data sources without data movement.
Prerequisites
Before starting, make sure you have:
- An active Snowflake account
- Akave Cloud credentials (access key, secret key, and bucket name)
- Akave’s S3-compatible endpoint URL
Step-by-Step Integration Guide
1. Create an S3-Compatible External Stage
In Snowflake, create a stage pointing to your Akave bucket.
You can do this directly in your Snowflake console or worksheet:
CREATE STAGE akave_stage
URL = 's3compat://my-snowflake-bucket/'
ENDPOINT = 'o3-rc2.akave.xyz'
CREDENTIALS = (AWS_KEY_ID = '1a2b3c...' AWS_SECRET_KEY = '4x5y6z...')
DIRECTORY = ( ENABLE = true );
This tells Snowflake to treat your Akave bucket as external storage.
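To confirm the stage points where you expect, you can describe it with Snowflake’s standard command:
DESC STAGE akave_stage;
The output should include your Akave URL and endpoint among the stage properties.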
2. Unload Sample Data to Akave
Next, copy some sample data from Snowflake into your Akave stage.
COPY INTO @akave_stage
FROM SNOWFLAKE_SAMPLE_DATA.TPCH_SF1.NATION
FILE_FORMAT = (TYPE = CSV, COMPRESSION = GZIP)
OVERWRITE = TRUE;
✅ You should see a confirmation that 25 rows (the full NATION table) were successfully unloaded to your Akave bucket.
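If you want to double-check, list the stage to see the file(s) Snowflake wrote; the file names are generated by Snowflake:
LIST @akave_stage;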
3. Create a CSV File Format
Tell Snowflake how to interpret the data in your Akave stage.
CREATE FILE FORMAT akave_csv_format
TYPE = CSV
COMPRESSION = GZIP;
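As an optional sanity check before creating the external table, you can query the staged files directly using positional columns ($1 through $4 correspond to the four NATION columns):
SELECT $1, $2, $3, $4
FROM @akave_stage (FILE_FORMAT => 'akave_csv_format')
LIMIT 5;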
4. Create an External Table
Now, create an external table in Snowflake that references the data stored in Akave.
CREATE OR REPLACE EXTERNAL TABLE akave_external_customer (
N_NATIONKEY NUMBER AS (VALUE:c1::NUMBER),
N_NAME STRING AS (VALUE:c2::STRING),
N_REGIONKEY NUMBER AS (VALUE:c3::NUMBER),
N_COMMENT STRING AS (VALUE:c4::STRING)
)
WITH LOCATION = @akave_stage
FILE_FORMAT = (FORMAT_NAME = akave_csv_format)
AUTO_REFRESH = FALSE
REFRESH_ON_CREATE = TRUE;
This table exists in Snowflake but queries data directly from Akave Cloud.
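Note that with AUTO_REFRESH disabled, Snowflake will not detect new files automatically. If you add more files to the bucket later, refresh the table’s metadata manually:
ALTER EXTERNAL TABLE akave_external_customer REFRESH;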
5. Query Data from Akave
Finally, run a query to verify your integration:
SELECT * FROM akave_external_customer LIMIT 10;
You’ll see results populate instantly; the data is pulled directly from Akave storage via Snowflake’s compute engine.
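External tables also expose metadata pseudocolumns, which are handy for verifying where rows come from. For example, to count rows per source file:
SELECT METADATA$FILENAME, COUNT(*) AS row_count
FROM akave_external_customer
GROUP BY METADATA$FILENAME;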
Optional: Using Iceberg Tables
For high-performance or versioned workloads, Akave Cloud also supports Apache Iceberg tables on Snowflake. Iceberg tables enable schema evolution, time travel, and optimized query performance, making them ideal for analytics and ML pipelines. These must be created with external volumes as opposed to external tables, and you can read more about how to use them with Snowflake in our docs: Iceberg Configuration on Snowflake
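As a rough sketch of the shape an external volume definition takes for S3-compatible storage (the names and values below are illustrative; see the linked docs for the exact configuration Akave requires):
CREATE EXTERNAL VOLUME akave_iceberg_vol
STORAGE_LOCATIONS = (
(
NAME = 'akave-iceberg-location'
STORAGE_PROVIDER = 'S3COMPAT'
STORAGE_BASE_URL = 's3compat://my-snowflake-bucket/iceberg/'
STORAGE_ENDPOINT = 'o3-rc2.akave.xyz'
CREDENTIALS = (AWS_KEY_ID = '1a2b3c...' AWS_SECRET_KEY = '4x5y6z...')
)
);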
Summary
That’s it: you’ve successfully connected Akave Cloud with Snowflake.
You can now offload large datasets, run analytics directly from external storage, and save significantly on cloud costs, all while maintaining full compatibility and verifiable data integrity.
▶️ Watch the demo: Using Akave Cloud with Snowflake

