Snowflake connection

Learn how to set up and use a Snowflake connection in Prophecy. With a Snowflake connection, you can read from and write to your Snowflake data warehouse using Source and Target gems, browse data in the Environment browser, and run pipelines that process Snowflake data.

Prerequisites

When you create a Snowflake connection in Prophecy, all data operations—such as reading or writing—are executed using the Snowflake credentials you provide. Ensure that your Snowflake user has the following permissions:

  • SELECT, INSERT, UPDATE, and DELETE on the tables used in your Prophecy pipelines.
  • OWNERSHIP on the table, if Prophecy needs to alter or replace it.
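For reference, the grants might look like the following. This is a minimal sketch; the database, schema, table, and role names (ANALYTICS.SALES.ORDERS, PROPHECY_ROLE) are placeholders for your own objects.

```sql
-- Hypothetical objects: the ANALYTICS.SALES.ORDERS table and the PROPHECY_ROLE role.
GRANT SELECT, INSERT, UPDATE, DELETE
  ON TABLE ANALYTICS.SALES.ORDERS
  TO ROLE PROPHECY_ROLE;

-- Only needed if Prophecy must alter or replace the table.
GRANT OWNERSHIP
  ON TABLE ANALYTICS.SALES.ORDERS
  TO ROLE PROPHECY_ROLE
  COPY CURRENT GRANTS;
```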

Additionally, Prophecy writes data to Snowflake by uploading Parquet files to a stage. This requires:

  • CREATE FILE FORMAT in the target schema.
  • USAGE on any file formats used for reading/writing Parquet files.
  • Write access to your user stage.
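The corresponding grants could be set up as sketched below. The schema, file format, and role names are placeholders; adapt them to your environment.

```sql
-- Allow the role to create file formats in the target schema (placeholder names).
GRANT CREATE FILE FORMAT
  ON SCHEMA ANALYTICS.SALES
  TO ROLE PROPHECY_ROLE;

-- Allow the role to use an existing file format (hypothetical name).
GRANT USAGE
  ON FILE FORMAT ANALYTICS.SALES.PROPHECY_PARQUET_FORMAT
  TO ROLE PROPHECY_ROLE;
```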

Feature support

The table below outlines whether the connection supports certain Prophecy features.

| Feature | Supported |
|---------|-----------|
| Read data with a Source gem | Yes |
| Write data with a Target gem | Yes |
| Browse data in the Environment browser | Yes |

Connection parameters

To create a connection with Snowflake, enter the following parameters:

| Parameter | Description |
|-----------|-------------|
| Connection Name | Name to identify your connection |
| Account | URL of your Snowflake account. Example: https://<orgname>-<account_name>.snowflakecomputing.com |
| Database | Default database for reading and writing data |
| Schema | Default schema for reading and writing data |
| Warehouse | Name of the SQL warehouse to use for the connection |
| Role | Snowflake role of the user to connect. Example: ACCOUNTADMIN |
| Authentication method | Enter your Snowflake username, and use a secret to provide your password |
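If you are unsure which organization and account names make up your account URL, you can look them up from a Snowflake worksheet. A minimal sketch using Snowflake's built-in context functions:

```sql
-- Returns the organization and account names used in the account URL,
-- i.e. https://<orgname>-<account_name>.snowflakecomputing.com
SELECT CURRENT_ORGANIZATION_NAME() AS orgname,
       CURRENT_ACCOUNT_NAME()      AS account_name;
```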

Data type mapping

When Prophecy processes data from Snowflake using SQL warehouses, it converts Snowflake-specific data types to formats compatible with your target warehouse. This table shows how Snowflake data types are transformed for Databricks and BigQuery.

| Snowflake | Databricks | BigQuery |
|-----------|------------|----------|
| NUMBER | BIGINT (Alias: Bigint) | INT64 (Alias: Integer) |
| INTEGER | BIGINT (Alias: Bigint) | INT64 (Alias: Integer) |
| BIGINT | BIGINT (Alias: Bigint) | INT64 (Alias: Integer) |
| SMALLINT | BIGINT (Alias: Bigint) | INT64 (Alias: Integer) |
| TINYINT | BIGINT (Alias: Bigint) | INT64 (Alias: Integer) |
| FLOAT | DOUBLE (Alias: Double) | FLOAT64 (Alias: Float) |
| DOUBLE | DOUBLE (Alias: Double) | FLOAT64 (Alias: Float) |
| REAL | DOUBLE (Alias: Double) | FLOAT64 (Alias: Float) |
| DECIMAL | DOUBLE (Alias: Double) | FLOAT64 (Alias: Float) |
| NUMERIC | DOUBLE (Alias: Double) | FLOAT64 (Alias: Float) |
| BOOLEAN | BOOLEAN (Alias: Boolean) | BOOL (Alias: Boolean) |
| VARCHAR | STRING (Alias: String) | STRING (Alias: String) |
| CHAR | STRING (Alias: String) | STRING (Alias: String) |
| STRING | STRING (Alias: String) | STRING (Alias: String) |
| TEXT | STRING (Alias: String) | STRING (Alias: String) |
| DATE | DATE (Alias: Date) | DATE (Alias: Date) |
| TIME | STRING (Alias: String) | TIME (Alias: Time) |
| DATETIME | TIMESTAMP (Alias: Timestamp) | TIMESTAMP (Alias: Timestamp) |
| TIMESTAMP_NTZ | TIMESTAMP (Alias: Timestamp) | TIMESTAMP (Alias: Timestamp) |
| TIMESTAMP_LTZ | TIMESTAMP (Alias: Timestamp) | TIMESTAMP (Alias: Timestamp) |
| TIMESTAMP_TZ | TIMESTAMP (Alias: Timestamp) | TIMESTAMP (Alias: Timestamp) |
| BINARY | BINARY (Alias: Binary) | BYTES (Alias: Bytes) |
| VARBINARY | BINARY (Alias: Binary) | BYTES (Alias: Bytes) |
| VARIANT | STRING (Alias: String) | STRING (Alias: String) |
| OBJECT | STRING (Alias: String) | STRING (Alias: String) |
| ARRAY | STRING (Alias: String) | STRING (Alias: String) |
| NULL | STRING (Alias: String) | STRING (Alias: String) |
info

Learn more in Supported data types.

Limitations

There are a few limitations on the data types you can read from Snowflake:

  • Prophecy reads Object, Array, and Variant columns as the String type.
  • Prophecy does not support writing Binary columns.
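If these limitations affect your data, one possible workaround (a sketch with hypothetical table, view, and column names, assuming your Source gem reads from a view) is to serialize the affected columns in Snowflake before Prophecy reads them:

```sql
-- Hypothetical source table with VARIANT and BINARY columns.
CREATE OR REPLACE VIEW ANALYTICS.SALES.ORDERS_FOR_PROPHECY AS
SELECT
    order_id,
    TO_JSON(order_payload)      AS order_payload_json,  -- VARIANT -> JSON string
    TO_VARCHAR(checksum, 'HEX') AS checksum_hex         -- BINARY  -> hex string
FROM ANALYTICS.SALES.ORDERS;
```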

Sharing connections within teams

Connections in Prophecy are stored within fabrics, which are assigned to specific teams. Once a Snowflake connection is added to a fabric, all team members who have access to the fabric can use the connection in their projects. No additional authentication is required—team members automatically inherit the access and permissions of the stored connection credentials.

caution

Be mindful of the access level granted by the stored credentials. Anyone on the team will have the same permissions—including access to sensitive data if allowed.

To manage this securely, consider creating a dedicated fabric and team for high-sensitivity connections. This way, only approved users have access to those credentials.

Fetching data

Prophecy fetches data from Snowflake connections in the following ways:

  • When you browse a Snowflake connection in the Environment browser, Prophecy fetches data on demand as you expand each level of the connection. You can manually refresh the Environment browser to see updated objects.

  • When a pipeline runs, Source gems will read the latest available version of the data. If the schema of your data in Snowflake changes, Prophecy will automatically use the new schema.