Orchestration

Once you have developed a Spark data Pipeline or a SQL Model using Prophecy, you will want to schedule it to run at some frequency. To support this, Prophecy provides an easy-to-use interface for developing Jobs with two different schedulers, plus a custom option:

  1. Databricks Jobs - for simpler use cases, where you just orchestrate multiple data Pipelines to run together. Databricks Jobs is the recommended scheduler if you're Databricks-native.

  2. Airflow - for more complex use cases, where you need various operators or additional pre- and post-processing of data, Prophecy can interface with your production-ready Airflow deployment. To get started with your first Airflow Jobs, try Prophecy Managed Airflow using this guide.

  3. Custom - alternatively, since Prophecy provides native Spark code on Git, you can integrate with any other scheduler or custom solution (a minimal sketch of this approach follows this list).
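To make the custom route concrete, here is a minimal sketch of scheduling the Spark code that Prophecy commits to Git with a hand-written Airflow DAG and spark-submit. The DAG id, schedule, and script path are hypothetical placeholders, and this is not the code that Prophecy's Job builder or Managed Airflow generates; it only illustrates that the generated code can be driven by whatever scheduler you already operate (assumes Airflow 2.4+).

```python
# Hypothetical example - not generated by Prophecy. Shows one way the Spark
# code that Prophecy commits to Git could be scheduled with a plain Airflow DAG.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="run_prophecy_pipeline",   # hypothetical DAG name
    schedule="0 2 * * *",             # run daily at 02:00 (Airflow 2.4+ syntax)
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    # Submit the pipeline's entry point with spark-submit; the master and
    # script path are placeholders for your own deployment.
    run_pipeline = BashOperator(
        task_id="spark_submit_pipeline",
        bash_command=(
            "spark-submit --master yarn "
            "/opt/pipelines/customer_orders/main.py"
        ),
    )
```

The same idea applies to any scheduler that can invoke spark-submit or call the Databricks Jobs API against the code in your Git repository.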

What's next

To learn more about these orchestration options, see the following pages: