The Varigence Blog
Tag - adf
Dev Diary - Initial Extension Points added to Mapping Data Flows
As part of ongoing improvements for our Mapping Data Flows feature, we are adding the first (of many) extension points.
Dev Diary - Embracing the new Azure Data Factory Script Activity
Varigence has moved fast to adopt the new Script Activity feature in Azure Data Factory.
Dev Diary - Customizing file paths for Delta Lake
It is now possible to configure specific directory structures for your Delta Lake using BimlFlex.
Dev Diary - Configuring load windows and filters for Mapping Data Flows
An overview of how to configure BimlFlex Parameters to manage load windows and apply other filters to the data selections.
Dev Diary - Connecting to unsupported data sources using Mapping Data Flows and BimlFlex
How BimlFlex loads data from otherwise unsupported data sources in Mapping Data Flows.
Dev Diary - Adding Transformations to Mapping Data Flows
BimlFlex will soon support generating custom transformations for Mapping Data Flows.
Dev Diary – Generating a Mapping Data Flow Staging process without a Persistent Staging Area
A real-world data solution often requires a combination of many different data loading patterns. In BimlFlex, these patterns are applied by configuring the design accordingly. This post shows how to configure BimlFlex to achieve different Staging pattern outcomes.
Dev Diary - First look at a source-to-staging pattern in Mapping Data Flows
With the 2021 BimlFlex release nearing completion, it's time to take a closer look at the patterns for Mapping Data Flows that will be made available in preview.
Dev Diary - Integration with the BimlCatalog
Every data solution benefits from a robust control framework for data logistics, one that manages if, how, and when individual data logistics processes are executed. A control framework also provides essential information to complete the audit trail of how data is processed through the system and ultimately made available to users. BimlFlex provides this through the BimlCatalog.
Dev Diary – Orchestrating Mapping Data Flows
In Azure, a Mapping Data Flow is not an object that can be executed directly. Instead, it must be called from a pipeline, typically through an Execute Data Flow activity. The pipeline can then be run and will in turn start the data flow. BimlFlex has advanced features to manage this orchestration.
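For illustration, a pipeline that invokes a Mapping Data Flow looks roughly like the following Azure Data Factory JSON; the pipeline and data flow names here are hypothetical placeholders, not actual BimlFlex output.

```json
{
  "name": "PL_Run_Staging_DataFlow",
  "properties": {
    "activities": [
      {
        "name": "Execute staging data flow",
        "type": "ExecuteDataFlow",
        "typeProperties": {
          "dataFlow": {
            "referenceName": "DF_Source_To_Staging",
            "type": "DataFlowReference"
          },
          "compute": {
            "computeType": "General",
            "coreCount": 8
          }
        }
      }
    ]
  }
}
```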
Dev Diary – Defining Mapping Data Flow Parameters with Biml
BimlFlex-generated output uses parameters at the Mapping Data Flow level to integrate with the BimlCatalog and to store metadata for use inside the data flow. This post explains how to set this up in Biml.
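As a rough sketch of what such parameters look like in the generated Azure Data Factory artifact (the data flow name and parameter names here are illustrative, not the actual BimlFlex conventions), a Mapping Data Flow declares them in its data flow script:

```json
{
  "name": "DF_Source_To_Staging",
  "properties": {
    "type": "MappingDataFlow",
    "typeProperties": {
      "sources": [],
      "sinks": [],
      "transformations": [],
      "scriptLines": [
        "parameters{",
        "     RowAuditId as integer (-1),",
        "     LastLoadDateTime as timestamp",
        "}"
      ]
    }
  }
}
```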
Using built-in logging for Biml
When working with Biml in any situation, whether in BimlExpress, BimlStudio, or BimlFlex, it can be helpful to peek into what is happening in the background. Biml provides built-in logging features to do so.
Dev Diary - Deploying Biml-Generated ADF Data Flow Mappings
BimlStudio can translate Biml into Data Flow Mappings; this post looks at deploying them and examining the results in Azure Data Factory.
Dev Diary - Generating ADF Data Flow Mapping using Biml
In this first dev diary post, we show the basic Biml syntax for creating Data Flow Mappings for Azure Data Factory.
Delta Lake on Azure work in progress – introduction
In this development blog series, we explain how inline data sources, and Delta Lake in particular, will be supported in BimlFlex.