
Low-code Jobs

Once you have developed a Spark data Pipeline or a SQL Model using Prophecy, you will want to schedule it to run at some frequency. To support this, Prophecy provides an easy-to-use, low-code interface for building Jobs with two different schedulers:

  1. Databricks Jobs - for simpler use cases, where you orchestrate multiple data Pipelines to run together. Databricks Jobs is the recommended scheduler if you run natively on Databricks.

  2. Airflow - for more complex use cases, where you need various operators or additional pre- and post-processing of data, Prophecy can interface with your production-ready Airflow deployment. To get started with your first Airflow Jobs, try Prophecy Managed Airflow using this guide. A minimal example DAG follows this list.
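
For illustration, here is a minimal sketch of an Airflow DAG that triggers an existing Databricks Job on a daily schedule. It assumes the apache-airflow-providers-databricks package is installed; the DAG ID, connection ID, schedule, and Job ID are placeholders, not values produced by Prophecy.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.databricks.operators.databricks import DatabricksRunNowOperator

with DAG(
    dag_id="run_prophecy_pipeline",       # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 2 * * *",        # run daily at 02:00
    catchup=False,
) as dag:
    trigger_pipeline = DatabricksRunNowOperator(
        task_id="trigger_pipeline",
        databricks_conn_id="databricks_default",  # Airflow connection to your workspace
        job_id=12345,                             # placeholder Databricks Job ID
    )
```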

Alternatively, since Prophecy stores your Pipelines as native Spark code on Git, you can easily integrate them with any other scheduler.
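
As a rough sketch, any scheduler that can execute a script can launch the Pipeline artifact built from that Git repository with spark-submit. Every path, class name, and master URL below is a placeholder for your own build output and cluster.

```python
import subprocess

# Invoke spark-submit on the artifact built from the Prophecy Git repo;
# check=True makes the scheduled task fail if the Pipeline exits non-zero.
subprocess.run(
    [
        "spark-submit",
        "--master", "yarn",                      # placeholder cluster manager
        "--class", "my_project.pipelines.Main",  # hypothetical Pipeline entry point
        "target/my_project-assembly-1.0.jar",    # hypothetical built artifact
    ],
    check=True,
)
```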