Orchestration
Available for Enterprise Edition only.
Once you have developed a Spark data pipeline or a SQL model in Prophecy, you may want to schedule it to run at a regular frequency. To support this, Prophecy provides an easy-to-use interface for developing jobs that run on external schedulers:
- Databricks Jobs
- Apache Airflow DAGs
- Custom
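As an illustration of what an external scheduler ultimately does with a scheduled pipeline, the sketch below builds the request payload for the Databricks Jobs `run-now` REST endpoint (`POST /api/2.1/jobs/run-now`). The job ID and parameter names here are placeholders, not values from Prophecy; this is a minimal sketch of the trigger mechanism, not Prophecy's implementation.

```python
import json

def build_run_now_request(job_id, notebook_params=None):
    """Build the JSON payload for the Databricks Jobs run-now API
    (POST /api/2.1/jobs/run-now).

    job_id is required; notebook_params is an optional dict of
    key/value overrides passed to the triggered run.
    """
    payload = {"job_id": job_id}
    if notebook_params:
        payload["notebook_params"] = notebook_params
    return json.dumps(payload)

# Hypothetical job ID and parameter; in practice these come from
# the Databricks workspace where the job is defined.
print(build_run_now_request(123, {"run_date": "2024-01-01"}))
```

A scheduler such as Airflow would send this payload (with a bearer token) to the workspace URL on each scheduled run.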
What's next
To continue exploring orchestration solutions, see the following pages:
- Pipeline monitoring
- Databricks Jobs
- Alternative Schedulers: Support for alternative orchestration solutions
- Multi Jobs Trigger: Complex pipeline interactions and timing