Azure Data Lake Storage connection

Prophecy supports direct integration with Azure Data Lake Storage (ADLS), allowing you to read from and write to ADLS containers as part of your data pipelines. This page explains how to configure the connection, what permissions are required, and how ADLS connections are managed and shared within your team.

Prerequisites

Prophecy connects to ADLS using the credentials you provide. These are used to authenticate requests and authorize all file operations during pipeline execution.

To ensure Prophecy can read from and write to your storage account, you must have the following Azure RBAC role or equivalent permissions:

  • Storage Blob Data Contributor: Read, write, and delete access to Blob storage containers and blobs.

To learn more, see Access control model in Azure Data Lake Storage.
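To grant the role above, you can assign it to the Microsoft Entra app (service principal) that Prophecy will authenticate as. A hedged sketch using the Azure CLI is shown below; the subscription ID, resource group, storage account name, and app ID are placeholders you must replace, and the command requires an authenticated `az` session with permission to create role assignments.

```shell
# Assign Storage Blob Data Contributor to the service principal at the
# storage-account scope. All identifiers below are hypothetical placeholders.
az role assignment create \
  --assignee "<APP_CLIENT_ID>" \
  --role "Storage Blob Data Contributor" \
  --scope "/subscriptions/<SUBSCRIPTION_ID>/resourceGroups/<RESOURCE_GROUP>/providers/Microsoft.Storage/storageAccounts/<STORAGE_ACCOUNT>"
```

Scoping the assignment to a single storage account (rather than the subscription) limits the credentials to only the data Prophecy needs.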

Feature support

The table below outlines whether the connection supports certain Prophecy features.

| Feature | Supported |
| --- | --- |
| Read data with a Source gem | Yes |
| Write data with a Target gem | Yes |
| Browse data in the Environment browser | Yes |
| Trigger scheduled pipeline upon file arrival or change | Yes |

Connection parameters

To create a connection with your ADLS account, enter the following parameters:

| Parameter | Description |
| --- | --- |
| Connection Name | Unique name for the connection. |
| Client ID | Your Microsoft Entra app client ID. |
| Tenant ID | Your Microsoft Entra tenant ID. |
| Client Secret (Secret required) | Your Microsoft Entra app client secret. |
| Account Name | Name of the ADLS storage account that hosts the container. |
| Container Name | Name of the container within the storage account. |
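For context, these same four credentials (account name, client ID, client secret, tenant ID) are what standard Spark/Hadoop ABFS service-principal authentication uses. The sketch below builds the well-known Hadoop configuration keys from those parameters; it illustrates the mapping only and is not a description of Prophecy's internal implementation. The account and ID values are hypothetical placeholders.

```python
def adls_oauth_conf(account_name, client_id, client_secret, tenant_id):
    """Build the Hadoop ABFS OAuth (client-credentials) settings for an
    ADLS storage account, keyed per account as Hadoop expects."""
    suffix = f"{account_name}.dfs.core.windows.net"
    return {
        f"fs.azure.account.auth.type.{suffix}": "OAuth",
        f"fs.azure.account.oauth.provider.type.{suffix}":
            "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
        f"fs.azure.account.oauth2.client.id.{suffix}": client_id,
        f"fs.azure.account.oauth2.client.secret.{suffix}": client_secret,
        f"fs.azure.account.oauth2.client.endpoint.{suffix}":
            f"https://login.microsoftonline.com/{tenant_id}/oauth2/token",
    }

# Hypothetical values for illustration only.
conf = adls_oauth_conf("mystorageacct", "app-client-id", "app-secret", "my-tenant-id")
print(conf["fs.azure.account.auth.type.mystorageacct.dfs.core.windows.net"])
```

In a Spark environment these keys would typically be applied via `spark.conf.set(key, value)` before reading an `abfss://` path.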

Sharing connections within teams

Connections in Prophecy are stored within fabrics, which are assigned to specific teams. Once an ADLS connection is added to a fabric, all team members who have access to that fabric can use the connection in their projects. No additional authentication is required—team members automatically inherit the access and permissions of the stored credentials.