Snowflake targets

To configure a target model that will be written to Snowflake, refer to the following sections.

Type & Format

Snowflake supports the following materialization types for target models. The type determines the underlying physical format of your target model; the sketch after the table shows how the selected type can appear in the generated dbt model.

| Materialization type | Description |
| --- | --- |
| View (default) | Rebuilt as a view on each run. Views always reflect the latest source data but don’t store any data themselves. |
| Table | Rebuilt as a table on each run. Tables are fast to query but can take longer to build. Target tables support multiple write modes. |
| Ephemeral | Not built in the database. The model’s logic is inlined into downstream models using a common table expression (CTE). Use for lightweight transformations early in your DAG. |
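
In dbt terms, the chosen type typically surfaces as the materialized config on the generated model. A minimal sketch, assuming a hypothetical model that reads from an illustrative stg_orders source:

```sql
-- Minimal sketch: the column names and stg_orders reference are illustrative.
-- materialized can be 'view' (default), 'table', or 'ephemeral'.
{{ config(
    materialized='table'
) }}

select
    order_id,
    customer_id,
    order_total
from {{ ref('stg_orders') }}
```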

Location

Review the location where your model will be written. Any changes you make to the Overwrite location section will be reflected in the Location that Prophecy generates.

| Location Parameter | Description | Advanced mode |
| --- | --- | --- |
| Catalog | Catalog where the model will be created. | Yes |
| Schema | Schema inside the catalog where the model will be created. | Yes |
| Alias | Sets the name of the resulting table or view. Defaults to the model name if not specified. | No |
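
In generated dbt code, these settings typically map to the database, schema, and alias configs (the Snowflake catalog corresponds to dbt's database config). A sketch with placeholder values:

```sql
-- Placeholder values; ANALYTICS, REPORTING, and daily_orders are hypothetical.
{{ config(
    database='ANALYTICS',    -- Catalog
    schema='REPORTING',      -- Schema inside the catalog
    alias='daily_orders'     -- Name of the resulting table or view
) }}

select * from {{ ref('stg_orders') }}
```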

Schema

Define the schema of the dataset and optionally configure additional properties.

The schema includes column names, column data types, and optional column metadata. When you expand a row in the Schema table, you can add a column description, apply column tags, and enable/disable quoting for column names.

Properties

Each property maps to a dbt configuration that is either generic to dbt or specific to a platform like Snowflake. If you do not add a property explicitly in the Schema tab, Prophecy uses the dbt default for that property. A sketch of several Snowflake-specific properties follows the table.

| Property | Description | Config type |
| --- | --- | --- |
| Dataset Tags | Add tags to the dataset. These tags can be used as part of the resource selection syntax in dbt. | Generic |
| Contract Enforced | Enforce a contract for the model schema, preventing unintended changes. | Generic |
| Show Docs | Control whether or not nodes are shown in the auto-generated documentation website. | Generic |
| Enabled | Control whether the model is included in builds. When a resource is disabled, dbt will not consider it as part of your project. | Generic |
| Meta | Set metadata for the table using key-value pairs. | Generic |
| Group | Assign a group to the table. | Generic |
| Persist Docs Columns | Save column descriptions in the database. | Generic |
| Persist Docs Relations | Save model descriptions in the database. | Generic |
| Cluster By | Order and cluster the table by specified keys to reduce Snowflake’s automatic clustering work. Accepts a string or list of strings. | Snowflake |
| Transient | Write the table as a Snowflake transient table, which does not preserve history and can reduce costs. By default, all Snowflake tables created by dbt are transient. | Snowflake |
| Automatic Clustering | Enables automatic clustering if manual clustering is enabled for your Snowflake account. You do not need to use this property if automatic clustering is enabled by default on your Snowflake account. | Snowflake |
| Snowflake Warehouse | Override the Snowflake warehouse that is used for this target model. | Snowflake |
| Query Tag | Add query tags to your model. | Snowflake |
| Copy Grants | Enable to preserve any existing access control grants (like GRANT SELECT TO ROLE analyst) when rebuilding the table or view. | Snowflake |
| Secure | Create a secure view that hides underlying data from unauthorized users. | Snowflake |
| Target Lag | Defines how frequently the table should be automatically refreshed. | Snowflake |
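
As an illustration, several of the Snowflake-specific properties above correspond to dbt model configs such as transient, cluster_by, snowflake_warehouse, query_tag, and copy_grants. The sketch below uses placeholder values; only the properties you actually set in the Schema tab would need to appear:

```sql
-- Illustrative values only; adjust the warehouse, keys, and tags to your project.
{{ config(
    materialized='table',
    transient=true,
    cluster_by=['event_date', 'account_id'],
    snowflake_warehouse='TRANSFORM_WH',
    query_tag='nightly_build',
    copy_grants=true
) }}

select * from {{ ref('stg_events') }}
```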
For more detailed information, see the dbt reference documentation.

SQL Query

Add a custom SQL query at the end of your target model using the Snowflake SQL dialect. This allows you to apply a final transformation step, which can be useful if you're importing an existing codebase and need to add conditions or filters to the final output. Custom queries support Jinja, dbt templating, and variable usage for your last-mile data processing.

You can reference any column present in the list of input ports beside the SQL query. You can add additional input ports, but the output port cannot be edited.
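
For example, a last-mile filter added here might look like the sketch below, where final_orders stands in for an input port and start_date is a hypothetical dbt variable:

```sql
-- Hypothetical final transformation; the column names and variable are illustrative.
select *
from {{ ref('final_orders') }}
where order_status = 'COMPLETED'
  and order_date >= '{{ var("start_date", "2024-01-01") }}'
```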

Write Options

For a complete guide to configuring how target tables are written, visit Write Options.

Data Tests

A data test is an assertion you define about a dataset in your project. Data tests are run on target models to ensure the quality and integrity of the final data that gets written to the warehouse. Learn how to build tests in Data tests.
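
As a quick illustration, a singular dbt test is a SQL file that selects rows violating an assertion; the test passes when the query returns no rows. The model and column names below are hypothetical:

```sql
-- tests/assert_no_negative_order_totals.sql (hypothetical singular test)
-- Fails if any rows are returned.
select
    order_id,
    order_total
from {{ ref('daily_orders') }}
where order_total < 0
```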