The Varigence Blog
Tag - DataFlowMapping
As part of ongoing improvements to our Mapping Data Flows feature, we are adding the first (of many) extension points.
It is now possible to configure specific directory structures for your Delta Lake using BimlFlex.
Loading data from unsupported data sources in Mapping Data Flows
BimlFlex will soon support generating custom transformations for Mapping Data Flows.
A real-world data solution often requires a combination of many different data loading patterns. With BimlFlex, these patterns are applied by configuring the design metadata accordingly. This post shows how to configure BimlFlex to achieve different Staging pattern outcomes.
With the 2021 BimlFlex release nearing completion, it's time to take a closer look at the patterns for Mapping Data Flows that will be made available in preview.
Every data solution benefits from a robust control framework for data logistics: one that manages if, how, and when individual data logistics processes are executed. A control framework also provides the essential information that completes the audit trail of how data is processed through the system and ultimately made available to users. BimlFlex provides the BimlCatalog for this.
In Azure, a Mapping Data Flow itself is not an object that can be executed directly. Instead, it needs to be called from a pipeline, using an Execute Data Flow activity. The pipeline can be run, and in turn it starts the data flow. BimlFlex has advanced features to manage this.
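To illustrate the pattern, a minimal Biml sketch of such a wrapper pipeline could look like the following. The element and attribute names here are an approximation of the BimlStudio schema for Azure Data Factory, not verbatim BimlFlex output, and all object names are illustrative.

```xml
<Biml xmlns="http://schemas.varigence.com/biml.xsd">
  <DataFactories>
    <DataFactory Name="MyDataFactory">
      <Pipelines>
        <!-- The pipeline is the executable unit in Azure Data Factory -->
        <Pipeline Name="PL_Execute_DF_Staging">
          <Activities>
            <!-- Execute Data Flow activity: running this pipeline starts the data flow -->
            <Dataflow Name="Execute DF_Staging" DataflowName="DF_Staging" />
          </Activities>
        </Pipeline>
      </Pipelines>
      <Dataflows>
        <!-- The Mapping Data Flow itself cannot be triggered without a pipeline -->
        <MappingDataflow Name="DF_Staging">
          <!-- Sources, transformations, and sinks omitted -->
        </MappingDataflow>
      </Dataflows>
    </DataFactory>
  </DataFactories>
</Biml>
```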
BimlFlex output uses Parameters at the Mapping Data Flow level to integrate with the BimlCatalog and to store metadata for use inside the data flow. This post explains how to set this up in Biml.
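As a sketch of what that could look like in Biml, the snippet below declares parameters on a data flow; the parameter names are illustrative stand-ins for the BimlCatalog metadata that BimlFlex passes in, not its exact output, and the Parameter element syntax is an assumption based on the surrounding schema.

```xml
<MappingDataflow Name="DF_Staging">
  <!-- Parameters declared at the data flow level; the calling pipeline
       supplies values at run time, typically retrieved from the BimlCatalog -->
  <Parameters>
    <!-- Illustrative names: an audit identifier and a load timestamp -->
    <Parameter Name="RowAuditId" DataType="integer">-1</Parameter>
    <Parameter Name="LoadTimestamp" DataType="string">''</Parameter>
  </Parameters>
  <!-- Inside the data flow, expressions can then reference $RowAuditId and
       $LoadTimestamp, for example in a Derived Column transformation -->
</MappingDataflow>
```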
BimlStudio can translate Biml into Mapping Data Flows, and this post looks into the deployment and the results in Azure Data Factory.
In this first dev diary post, we show the basic Biml syntax to create Mapping Data Flows for Azure Data Factory.
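As a taste of what that syntax involves, a minimal Mapping Data Flow, one source streamed straight into one sink, could be sketched roughly as follows. The element and attribute names approximate the BimlStudio schema rather than quoting the post, and all dataset and stream names are made up for the example.

```xml
<Biml xmlns="http://schemas.varigence.com/biml.xsd">
  <DataFactories>
    <DataFactory Name="MyDataFactory">
      <Dataflows>
        <MappingDataflow Name="DF_HelloWorld">
          <Sources>
            <!-- Source transformation reading from an ADF dataset -->
            <Source Name="SourceCustomer" DatasetName="DS_Customer" />
          </Sources>
          <Sinks>
            <!-- Sink writing the source stream to the staging dataset -->
            <Sink Name="SinkCustomer" DatasetName="DS_Staging_Customer"
                  InputStreamName="SourceCustomer" />
          </Sinks>
        </MappingDataflow>
      </Dataflows>
    </DataFactory>
  </DataFactories>
</Biml>
```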