Snowflake enables data storage, processing, and analytic solutions that are faster, easier to use, and far more flexible than traditional offerings.
Snowflake’s Data Cloud is powered by an advanced data platform provided as a self-managed service. MetaRouter allows clients to pass event and identity data to their chosen cloud storage account in real time. By ensuring that compliant, correctly configured event data flows into cloud storage and then into the data warehouse, MetaRouter reduces processing time and improves data quality. MetaRouter can also offer enterprise-level clients access to additional vendor IDs that can be used for attribution, identity graphing, or second-party data sales.
MetaRouter offers two integration options: the fully integrated S3 option in the MetaRouter UI integrations for Snowflake, or another cloud storage account set up by following Snowflake’s own instructions. For non-S3 cloud accounts, clients pick their desired cloud storage provider, deliver their event data to it via the corresponding MetaRouter cloud storage integration, and then connect Snowflake to that storage account. With these steps, clients can quickly get their event data flowing into Snowflake for processing. Alternatively, the client can define a desired file size; once a file of that size has been completed, the cloud storage “responds” with the URL to the file, much like a webhook: we send a request, the response comes back immediately with the file URL, and that notifies us to push the file to Snowflake via the Snowpipe API.
MetaRouter Snowflake Setup Instructions
Setup within Snowflake UI
Within Snowflake, go to Worksheets, create a new SQL worksheet, and add the following:
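-- Run as SECURITYADMIN to create the service user that MetaRouter will connect as.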
USE ROLE SECURITYADMIN;
CREATE USER mr_user;
ALTER USER mr_user SET PASSWORD='ADD_NEW_PASSWORD_HERE';
ALTER USER mr_user SET RSA_PUBLIC_KEY='ADD_RSA_PUBLIC_KEY_HERE';
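-- Switch to SYSADMIN to create the database, schema, landing table, and warehouse for MetaRouter events.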
USE ROLE SYSADMIN;
CREATE DATABASE METAROUTER_MR_STARTER_KIT_TESTING;
USE DATABASE METAROUTER_MR_STARTER_KIT_TESTING;
CREATE SCHEMA metarouter;
USE SCHEMA METAROUTER_MR_STARTER_KIT_TESTING.metarouter;
CREATE TABLE METAROUTER_MR_STARTER_KIT_TESTING.metarouter.events(raw VARIANT);
CREATE WAREHOUSE METAROUTER_MR_STARTER_KIT_TESTING_WAREHOUSE WAREHOUSE_SIZE=xsmall AUTO_SUSPEND=300;
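-- Create an external stage pointing at the S3 bucket that MetaRouter writes event files to.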
CREATE STAGE events_stage
URL='s3://ADD_BUCKET_NAME_HERE/'
CREDENTIALS=(AWS_KEY_ID='ADD_AWS_KEY_ID_HERE' AWS_SECRET_KEY='ADD_AWS_SECRET_KEY_HERE');
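-- Create a Snowpipe that copies staged JSON files into the events table.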
CREATE PIPE METAROUTER_MR_STARTER_KIT_TESTING.metarouter.events_pipe
AS COPY INTO METAROUTER_MR_STARTER_KIT_TESTING.metarouter.events
FROM @METAROUTER_MR_STARTER_KIT_TESTING.metarouter.events_stage
FILE_FORMAT=(type='JSON');
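-- Create a role for the MetaRouter user and grant it access to the pipe, schema, and database.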
USE ROLE SECURITYADMIN;
CREATE ROLE MR_USER;
GRANT ROLE MR_USER TO USER mr_user;
USE ROLE SYSADMIN;
GRANT ALL PRIVILEGES ON PIPE METAROUTER_MR_STARTER_KIT_TESTING.metarouter.events_pipe
TO ROLE MR_USER;
GRANT ALL PRIVILEGES ON SCHEMA METAROUTER_MR_STARTER_KIT_TESTING.metarouter
TO ROLE MR_USER;
GRANT USAGE ON DATABASE METAROUTER_MR_STARTER_KIT_TESTING
TO ROLE MR_USER;
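-- Set the default warehouse and role for the MetaRouter user.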
USE ROLE SECURITYADMIN;
ALTER USER mr_user SET
DEFAULT_WAREHOUSE='METAROUTER_MR_STARTER_KIT_TESTING_WAREHOUSE'
DEFAULT_ROLE='MR_USER';
Make sure to replace all ADD_XXXXXXXXX_HERE placeholders with your own values:
- ADD_NEW_PASSWORD_HERE - set a password for the newly created user
- ADD_RSA_PUBLIC_KEY_HERE - generate your private and public keys (see Snowflake’s key-pair authentication documentation)
- ADD_BUCKET_NAME_HERE - your S3 bucket name
- ADD_AWS_KEY_ID_HERE - your AWS Key ID
- ADD_AWS_SECRET_KEY_HERE - your AWS Secret Key
Run all of the commands above.
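After the script finishes, you can optionally verify the setup from the same worksheet. This is a quick sanity check only, assuming the object names above were left unchanged:
-- Confirm the pipe exists and reports a RUNNING execution state.
SELECT SYSTEM$PIPE_STATUS('METAROUTER_MR_STARTER_KIT_TESTING.metarouter.events_pipe');
-- Once MetaRouter begins delivering files, spot-check the raw JSON payloads.
SELECT raw FROM METAROUTER_MR_STARTER_KIT_TESTING.metarouter.events LIMIT 10;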
Setup within MetaRouter UI
- USER - the user name from your Snowflake CREATE USER command
- PASSWORD - the password from your Snowflake ALTER USER ... SET PASSWORD command
- ACCOUNT - your Snowflake account identifier, <account id>-<organisation id>, in lowercase
- DATABASE - the database name from your Snowflake CREATE DATABASE command
- WAREHOUSE - the warehouse name from your Snowflake CREATE WAREHOUSE command
- PRIVATEKEY - the private key you generated while creating the public key in the previous step
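If you are unsure of the ACCOUNT value or want to confirm that the public key was registered, the following queries can help (a sketch using standard Snowflake context functions; run them with a role that can view the user, such as SECURITYADMIN):
-- Returns the organisation and account names that make up the account identifier.
SELECT CURRENT_ORGANIZATION_NAME(), CURRENT_ACCOUNT_NAME();
-- The RSA_PUBLIC_KEY_FP property confirms the public key is attached to the user.
DESCRIBE USER mr_user;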
Setup within S3 UI
- REGION - the region of your AWS bucket
- BUCKET - your S3 bucket name
- ACCESS_KEY - your AWS Access Key ID
- SECRET_KEY - your AWS Secret Access Key
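To confirm from the Snowflake side that the bucket, region, and AWS credentials are valid, you can list the contents of the stage created earlier (adjust the stage name if you changed it):
-- Errors here usually indicate a bad bucket name or invalid AWS credentials.
LIST @METAROUTER_MR_STARTER_KIT_TESTING.metarouter.events_stage;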
What do I need to integrate for other cloud storage accounts?
Choose and Set Up a Cloud Storage Account
- Set up Amazon S3 - https://aws.amazon.com/pm/serv-s3/
- Set up Google Cloud Storage - https://support.google.com/cloud/answer/6250993?hl=en
- Set up Microsoft Azure - https://learn.microsoft.com/en-us/azure/cloud-adoption-framework/ready/azure-setup-guide/
Connect MetaRouter Integration to Cloud Storage Account
Once the client’s desired cloud storage account is chosen and set up, use the corresponding MetaRouter integration to pipe data to that account.
- Set up the MetaRouter Amazon S3 integration - https://docs.metarouter.io/docs/amazon-s3
- Set up the MetaRouter Google Cloud Storage integration - https://docs.metarouter.io/docs/google-cloud-storage
- Set up the MetaRouter Microsoft Azure Blob integration - https://docs.metarouter.io/docs/blob
Configure Snowflake
Configure Snowflake to read and consume event data from the chosen cloud storage account; a minimal Google Cloud Storage example follows the links below.
- Configure Snowflake to Consume Amazon S3 - https://docs.snowflake.com/en/user-guide/data-load-s3
- Configure Snowflake to Consume Google Cloud Storage - https://docs.snowflake.com/en/user-guide/data-load-gcs
- Configure Snowflake to Consume Microsoft Azure - https://docs.snowflake.com/en/user-guide/data-load-azure
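For example, if Google Cloud Storage is chosen instead of S3, the Snowflake side looks broadly like the S3 script above but uses a storage integration and a gcs:// URL. This is a minimal sketch only: the integration, stage, and pipe names are placeholders, and Snowflake’s GCS guide linked above covers the GCP-side permissions you must grant.
-- Create a storage integration Snowflake will use to access the GCS bucket.
CREATE STORAGE INTEGRATION mr_gcs_integration
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = 'GCS'
  ENABLED = TRUE
  STORAGE_ALLOWED_LOCATIONS = ('gcs://ADD_BUCKET_NAME_HERE/');
-- The stage and pipe mirror the S3 setup, pointing at the GCS bucket instead.
CREATE STAGE METAROUTER_MR_STARTER_KIT_TESTING.metarouter.gcs_events_stage
  URL = 'gcs://ADD_BUCKET_NAME_HERE/'
  STORAGE_INTEGRATION = mr_gcs_integration;
CREATE PIPE METAROUTER_MR_STARTER_KIT_TESTING.metarouter.gcs_events_pipe
  AS COPY INTO METAROUTER_MR_STARTER_KIT_TESTING.metarouter.events
  FROM @METAROUTER_MR_STARTER_KIT_TESTING.metarouter.gcs_events_stage
  FILE_FORMAT = (TYPE = 'JSON');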
This allows data to flow from your digital property, through MetaRouter’s cloud storage integration, and ultimately into your Snowflake warehouse, consumed on a schedule of your choosing within Snowflake’s UI.