Databricks SQL
To use your Databricks SQL warehouse for execution in Prophecy, you need to create a SQL fabric with a Databricks connection.
Create a fabric
Fabrics define your Prophecy project execution environment. To create a new fabric:
- Click on the Create Entity button from the left navigation bar.
- Click on the Fabric tile.
Basic Info
Next, complete the fields in the Basic Info page.
- Provide a fabric title and description. It can be helpful to include descriptors like dev or prod in your title.
- Select a team to own this fabric. Click the dropdown to list the teams that you are a member of. If you don't see the desired team, ask a Prophecy Administrator to add you to a team.
- Click Continue.
Provider
The SQL provider is both the storage warehouse and the execution environment where your SQL code will run. To configure the provider:
- Select SQL as the Provider type.
- Click the dropdown menu for the list of supported Provider types, and select Databricks.
- Copy the JDBC URL from the Databricks UI. This is the URL that Prophecy will connect to for SQL warehouse data storage and execution.
Note: If you are using self-signed certificates, add AllowSelfSignedCerts=1 to your JDBC URL.
- Add a personal access token (PAT) that will let Prophecy connect to Databricks. Each user supplies their own token when using the fabric. To generate a PAT, follow the Databricks documentation.
- Optional: Enter the Catalog name if you are using Unity Catalog.
- Click Continue.
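The self-signed certificate note above amounts to appending one parameter to the semicolon-separated JDBC URL. As a minimal sketch (the URL below is a hypothetical example, not a real workspace):

```python
def add_self_signed_flag(jdbc_url: str) -> str:
    """Append AllowSelfSignedCerts=1 to a Databricks JDBC URL if it is absent."""
    if "AllowSelfSignedCerts=1" in jdbc_url:
        return jdbc_url
    # JDBC URL parameters are separated by semicolons.
    sep = "" if jdbc_url.endswith(";") else ";"
    return f"{jdbc_url}{sep}AllowSelfSignedCerts=1"

# Hypothetical JDBC URL for illustration only.
url = "jdbc:databricks://example.cloud.databricks.com:443/default;transportMode=http"
print(add_self_signed_flag(url))
# → jdbc:databricks://example.cloud.databricks.com:443/default;transportMode=http;AllowSelfSignedCerts=1
```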
Prophecy respects individual user credentials when accessing Databricks catalogs, tables, databases, etc.
Prophecy supports Databricks Volumes. When you run a Python or Scala pipeline via a job, the pipeline must be bundled as a whl/jar artifact. These artifacts must be accessible to the Databricks job so that they can be installed as libraries on the cluster. Under Artifacts, you can designate a path to a Volume where the whl/jar files are uploaded.
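Unity Catalog Volume paths follow the shape /Volumes/&lt;catalog&gt;/&lt;schema&gt;/&lt;volume&gt;/&lt;path&gt;. A minimal sketch of building such an artifact path, where the catalog, schema, volume, and file names are hypothetical:

```python
ALLOWED_EXTENSIONS = (".whl", ".jar")

def volume_artifact_path(catalog: str, schema: str, volume: str, artifact: str) -> str:
    """Build a Unity Catalog Volumes path for a pipeline artifact.

    Only whl/jar files make sense as cluster libraries, so anything
    else is rejected.
    """
    if not artifact.endswith(ALLOWED_EXTENSIONS):
        raise ValueError(f"expected a whl or jar artifact, got: {artifact}")
    return f"/Volumes/{catalog}/{schema}/{volume}/{artifact}"

# All names below are placeholders for illustration.
print(volume_artifact_path("main", "prophecy", "artifacts", "my_pipeline-0.1-py3-none-any.whl"))
# → /Volumes/main/prophecy/artifacts/my_pipeline-0.1-py3-none-any.whl
```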
Optional: Connections
If you want Prophecy to crawl your warehouse metadata on a regular basis, you can set up a connection here.
What's next
Attach a fabric to your SQL project and begin data modeling!