September 2023

3.2.0.* (September 14, 2023)

  • Prophecy Python libs version: 1.6.1
  • Prophecy Scala libs version: 7.1.14

Features

Introducing Package Hub

We are excited to introduce the Package Hub with Prophecy version 3.2. This feature allows platform leads to create and share standardized Pipeline components, including custom Gems, Subgraphs, Datasets, Pipelines, and more. For instance, you can now ensure consistent data encryption in line with your company's policies. Simply create a new Project, add the required custom encryption Gems and components, and publish it as a Package. Other teams can easily add this as a dependency and utilize the components from the Package.
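
For instance, the core of a shared encryption component could wrap standard PySpark column logic like the sketch below (a hedged illustration only, assuming Spark 3.3+'s built-in aes_encrypt SQL function; the function and column names are hypothetical, and the Gem wrapper itself would be built in Prophecy's Gem builder):

```python
from pyspark.sql import DataFrame, SparkSession
from pyspark.sql import functions as F

def encrypt_column(df: DataFrame, column: str, key: str) -> DataFrame:
    """Replace `column` with its AES-encrypted, base64-encoded value.

    Uses Spark 3.3+'s built-in aes_encrypt SQL function; in practice the
    key would come from a secret scope rather than a literal.
    """
    return df.withColumn(
        column,
        F.base64(F.expr(f"aes_encrypt({column}, '{key}')")),
    )

if __name__ == "__main__":
    spark = SparkSession.builder.getOrCreate()
    df = spark.createDataFrame([("alice@example.com",)], ["email"])
    encrypt_column(df, "email", "0123456789abcdef").show(truncate=False)
```

Publishing logic like this once as a Package means every consuming team applies the same policy instead of re-implementing it.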

info

Please note that all Prophecy-provided Gems now follow this same Package concept. With this release, your existing Projects will depend on Prophecy-provided Packages for those Gems. Your existing Pipelines and Projects should continue to work seamlessly after the upgrade. If you encounter any issues, please reach out to our support team.

Knowledge Graph - Connectors

When you navigate to the Environment tab in the Project browser, you can browse the Catalog tables from the Databricks Fabric to which you are connected. These Catalog tables are automatically refreshed in the background using Connectors in the Fabric. We'll refresh the tables according to the refresh frequency defined in your Connector settings. This ensures that users no longer need to wait and manually refresh the Environment tab while creating or modifying Pipelines.

Copilot 2.0

We've worked hard to enhance your Copilot experience in version 3.2. With this release, we've added AI magic to the following features:

  • Generate User-Defined Functions (UDFs) for your Spark Projects with simple English prompts (see the sketch after this list).
  • Incrementally add a Gem or a group of Gems to your Pipeline.
  • Generate PySpark or Scala Spark code for your custom scripts in Pipelines.
  • Automatically generate labels and descriptions for Gems, Pipelines, Models, or even your entire Project's README.
  • Generate smart column metadata for any or all columns in your Datasets.
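
For the first item above, a prompt like "strip phone numbers down to digits" might yield a PySpark UDF along these lines (a hypothetical sketch of the kind of code Copilot can generate, not actual Copilot output; the function name is made up):

```python
import re

from pyspark.sql import functions as F
from pyspark.sql.types import StringType

@F.udf(returnType=StringType())
def normalize_phone(phone: str) -> str:
    """Strip a phone number down to its digits, keeping a leading '+'."""
    if phone is None:
        return None
    digits = re.sub(r"[^0-9]", "", phone)
    return ("+" + digits) if phone.strip().startswith("+") else digits

# Example usage inside a Pipeline's generated code:
# df = df.withColumn("phone", normalize_phone(F.col("phone")))
```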

We believe these additions to Prophecy's Copilot capabilities will make managing and analyzing data a breeze.

Dataproc Fabric

We're thrilled to announce that this release introduces first-class support for Dataproc. You can now create a Dataproc Fabric to run Pipelines interactively. With this addition, we now support all types of Spark execution environments.

DLT Support in SQL

With this release, we're adding support for Delta Live Tables (DLT) to our Low Code SQL product. This is achieved by supporting materialized views: simply set the materialized property in the dbt-defined configs.
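A minimal model-level sketch is shown below (hedged: it assumes the dbt-databricks adapter's materialized_view materialization, and the model and source names are hypothetical):

```sql
-- models/daily_orders.sql (hypothetical model)
-- materialized='materialized_view' asks the dbt-databricks adapter to build
-- this model as a Databricks materialized view, which runs on DLT.
{{ config(materialized='materialized_view') }}

select
    order_date,
    count(*) as order_count
from {{ ref('orders') }}
group by order_date
```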

Guided Private Deployment

Prophecy Private Deployment now lets users install Prophecy securely within their own Virtual Private Cloud (VPC). To simplify this process, we've introduced a dedicated deployment tool, accessible through the settings page, that gives administrators a comprehensive, step-by-step guide for a seamless deployment experience. It walks you through setting up the necessary infrastructure and deploying Prophecy using Helm charts. To enable Guided Private Deployment, please get in touch with our support team.

Minor Improvements

  • Enhanced UX for Project Metadata Screens: Our Project Metadata screens have received a fresh look with this release, improving the overall Project browsing experience. Rest assured, all existing functionality remains intact with this new look.

  • Code Generation Improvements: We're continually working to ensure that the code generated for your Spark Pipelines is of the highest quality. This release includes minor improvements to code generation, so you may notice small changes the next time you open your Pipeline code.

  • Option to Delete Branch Locally: We've added a new option to delete a branch from the local repository only, giving you more control over branch management.

  • Organize Models in Subdirectories: You can now better organize the Models in our Low Code SQL product using subdirectories.