Snowflake SQL

To use your Snowflake warehouse for execution in Prophecy, you need to create a SQL fabric with a Snowflake connection.

Create a fabric

Fabrics define your Prophecy project execution environment. To create a new fabric:

  1. Click the Create Entity button in the left navigation bar.
  2. Click the Fabric tile.

Basic Info

Next, complete the fields in the Basic Info page.

  1. Provide a fabric title and description. It can be helpful to include descriptors like dev or prod in your title.
  2. Select a team to own this fabric. Click the dropdown to list the teams you are a member of. If you don't see the desired team, ask a Prophecy Administrator to add you to one.
  3. Click Continue.


Provider

The SQL provider is both the storage warehouse and the execution environment where your SQL code will run. To configure the provider:

  1. Select SQL as the Provider type.
  2. Click the dropdown menu for the list of supported Provider types and select Snowflake.
  3. Add the Snowflake Account URL (for example, https://<org>-<account>.snowflakecomputing.com).
  4. Add the username that will be used to connect to the Snowflake Warehouse.
  5. Add the password that will be used to connect to the Snowflake Warehouse.

    info

    Each Prophecy user provides their own username/password credential upon login. Prophecy respects the access scope of the Snowflake credentials, meaning users in Prophecy can read tables only from the databases and schemas they can access in Snowflake. These username/password credentials are encrypted for secure storage.

  6. Add the Snowflake role that Prophecy will use to read data and execute queries on the Snowflake Warehouse. The role must already be granted to the user provided above.
  7. Specify the Snowflake warehouse for default writes for this execution environment.
  8. Specify the desired Snowflake database and schema where tables will be written to by default.
  9. Click Continue to complete fabric setup.
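Before entering these values, it can be useful to sanity-check them locally. The sketch below validates the shape of the account URL from step 3 and assembles the fields from steps 3–8 into a single connection dictionary. The regex, function names, and placeholder values are illustrative only, not a Prophecy API:

```python
import re

# Expected shape of a Snowflake account URL, e.g.
# https://myorg-myaccount.snowflakecomputing.com
ACCOUNT_URL = re.compile(
    r"^https://(?P<account>[\w.-]+)\.snowflakecomputing\.com/?$"
)

def parse_account(url: str) -> str:
    """Return the account identifier portion of a Snowflake account URL."""
    match = ACCOUNT_URL.match(url.strip())
    if not match:
        raise ValueError(f"Not a Snowflake account URL: {url!r}")
    return match.group("account")

def build_connection(url, user, password, role, warehouse, database, schema):
    """Assemble the fields from steps 3-8 into one connection dict."""
    return {
        "account": parse_account(url),
        "user": user,
        "password": password,
        "role": role,            # must already be granted to this user
        "warehouse": warehouse,  # default warehouse for writes
        "database": database,    # default write location
        "schema": schema,
    }
```

The resulting dictionary matches the keyword arguments accepted by `snowflake.connector.connect(**conn)`, so the same values can be verified outside Prophecy with the Snowflake Python connector if needed.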


Schedule jobs

SQL models can be scheduled to run on Snowflake using Airflow. Create an Airflow fabric, and set up a Snowflake connection that references the Snowflake fabric created above.
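The Airflow connection typically carries the same account details as the fabric in its `extra` field. A minimal sketch of assembling that JSON blob, assuming the key names used by the Airflow Snowflake provider (the values shown are placeholders):

```python
import json

def snowflake_conn_extra(account: str, warehouse: str,
                         database: str, role: str) -> str:
    """Build the JSON 'extra' blob for an Airflow Snowflake connection.

    The keys (account, warehouse, database, role) mirror the fields
    entered when creating the Snowflake fabric above.
    """
    extra = {
        "account": account,
        "warehouse": warehouse,
        "database": database,
        "role": role,
    }
    return json.dumps(extra)
```

The resulting string could then be pasted into the Extra box of the connection in the Airflow UI, or passed as `--conn-extra` to `airflow connections add`.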

What's next

Attach a fabric to your SQL project and begin data modeling!