Varigence Blog


Become a Biml Hero Certified Expert!

Written by Varigence.com on 3.24.2015

Biml Hero Certified Expert

As the Biml ecosystem has continued to grow, several customers, partners, and community members have asked if we could create a formal certification program for Biml. In response to this demand, we have created the Biml Hero Certified Expert Program. Biml Heroes will be certified as experts by the creator of Biml, Scott Currie.

The program has been designed to ensure that Biml Heroes can effectively use Biml to create real-world, production-quality business intelligence and data warehousing solutions. Unlike some certification programs, you won't earn a certification by just sitting in a class for a few hours. In order to become a Biml Hero, you'll need to attend eight hours of advanced training. Then you'll have to pass a comprehensive test on Biml as well as submit a production-quality code sample and a case study detailing how you implemented a data solution with Biml. The code sample and case study will be reviewed in depth by the Biml product team at Varigence. Even those whose solutions are approved will receive detailed one-on-one feedback on how to write even better Biml code and further improve their skills in the future.

Earning the Biml Hero certification will not be easy; however, it will be well worth the effort. Below are just a few of the many benefits of being a Biml Hero:

  • Biml Heroes will receive exclusive training as part of the certification process.
  • Varigence will refer Biml Heroes to organizations that request assistance with Biml or are looking to hire someone who is proficient in Biml.
  • Biml Heroes will be featured on both the Bimlscript.com and Varigence.com websites. They can add a picture, a short bio, and links to their social networking pages.
  • Biml Heroes can display the Biml Hero badge and Biml Hero Certified Expert logo for professional use on their blogs, business cards, social networking sites, and other marketing material.
  • Varigence will provide access to online Biml content and communities created exclusively for Biml Heroes.
  • Upon initial certification you will receive a complimentary Biml Hero t-shirt, polo, and jacket.

To learn more about becoming a Biml Hero, please download the Biml Hero program guide.

To register and enroll in the Biml Hero training program, please contact training@varigence.com.


Mist 4.0 Released

Written by Varigence Blog on 2.12.2015

Mist 4.0

Varigence is pleased to announce the release of Mist 4.0. Now Biml users and Business Intelligence professionals around the world can develop and automate BI solutions faster than ever before. Read below to see the top ten game-changing features of Mist 4.0. Then download a free trial of Mist 4.0 and see for yourself why there is no better IDE in the world for building BI solutions than Mist.

Top Ten New Features of Mist 4.0

  1. Metadata modeling - Makes the use and management of metadata a reality by providing a general purpose mechanism for creating metadata models, enforcing validation rules, and providing metadata entry user interfaces (UIs) for non-technical Subject Matter Experts (SMEs) or Business Analysts (BAs). Mist autogenerates a metadata entry interface for non-technical users based on your metadata model.

  2. Offline schemas - Development teams can develop BI assets in a disconnected state.

  3. Full support for SQL Server 2014 - You can build and deploy your BI solution to SQL Server 2014. Builds targeting SQL Server 2012 and 2014 now also produce ISPAC output files for ease of deployment automation.

  4. SSIS Importer - Improved with support for importing entire projects, ISPAC files, deployed projects from SQL Server, or individual packages. The importer can now be called programmatically which enables automated import, transformation, and re-emission of existing SSIS projects.

  5. True Multiple Document Interface (MDI) – Multiple editing windows can be opened at one time.

  6. Redesigned Ribbon – Improved layout and design for easier access to most used features.

  7. Logical View - New groupings for Relational, SSIS, SSAS, Metadata, Library, and Broken or Live BimlScripts. Improved management of transformers, frameworks, and build configurations.

  8. Improved connection designers – no more connection string authoring frustration.

  9. Biml Code Editor - Improved and expanded intelliprompt completion lists.

  10. Code Support – Ability to author C# and VB code files that are included with your BimlScript builds. Extension methods, helper classes, you name it! (See the sketch below.)
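
To give a flavor of the new code support, here is a minimal sketch of a C# helper file that could be included in a BimlScript build. The StringHelpers class and ToSafeSsisName method are hypothetical illustrations, not part of the Biml API:

// StringHelpers.cs - a hypothetical helper file included in a BimlScript build.
public static class StringHelpers
{
    // Replaces characters that are awkward in SSIS object names.
    public static string ToSafeSsisName(this string name)
    {
        return name.Replace(".", "_").Replace("/", "_").Replace("\\", "_");
    }
}

A BimlScript file in the same project could then call <#=table.Name.ToSafeSsisName()#> when emitting package or task names.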


Varigence Australia Announces New Consulting Partner - Hilmax Solutions

Written by Varigence Blog on 10.9.2014

Sydney, Australia - October 9, 2014 - Varigence is pleased to announce a consulting partnership with Hilmax Solutions.

As partners, Varigence and Hilmax Solutions are committed to working together to extend the power of existing metadata frameworks, delivering better business intelligence solutions faster and far more affordably.

Varigence will assist Hilmax in providing training, consulting, implementation services, and support to the Biml and Mist community.


Varigence and Rehfeld Announce Strategic Partnership to Integrate Mist and Effektor

Written by Varigence Blog on 6.2.2014

Mist + Effektor

Press Release

Greenville, SC, USA and Copenhagen, Denmark – June 2, 2014 - Varigence, the creator of the rapidly growing Biml ecosystem, and Rehfeld, the leading provider of Business Intelligence and Data Warehousing professional services in Scandinavia, are pleased to announce a long-term strategic partnership to bring premier tools for Business Intelligence and Data Warehouse development to the global market. Under the agreement, Varigence’s Mist Integrated Development Environment (IDE) will be used to extend the power and capabilities of Rehfeld’s Self-Service Data Warehousing Platform, Effektor.

“Integrating Mist with Effektor is another great addition to the rapidly growing Biml/Mist ecosystem. By combining the power of Mist with Effektor’s data warehousing capabilities Rehfeld can now offer one of the most advanced BI solutions ever created,” said Scott Currie, Varigence CEO.

Mist is an integrated development environment (IDE) for authoring Business Intelligence Markup Language (Biml) code and data assets. Mist leverages visual design capabilities and debugging features to provide a unified experience for managing entire business intelligence solutions, including relational modeling, data transformation packages, and OLAP cubes. Mist also incorporates modern programming IDE features, including text editors with syntax highlighting, Intelliprompt and quick-info displays, source control integration, and multi-monitor support. Combined with the capabilities of Biml to automatically generate huge amounts of data assets from metadata, reuse code, enforce patterns and best practices, Mist changes the economics of building business intelligence and data warehouse solutions.

Effektor is a metadata-driven platform for DW/BI solutions that supports the full range of BI needs, from importing source system data through data warehousing and Master Data Management to reporting on both relational data and OLAP cubes. The platform is built around the concepts of ease of use and self-service: making changes to your data warehouse's structure or data should not require deep technical expertise, and as many tasks as possible should be done by data stewards and end users, not IT pros. Effektor also offers self-service functionality for row-based security settings and Master Data Management.

“Since its initial release, our customers have been implementing cutting-edge BI solutions using Effektor in a fraction of the time and cost of hand-built solutions, but with a high-quality data warehouse architecture. Now, with the integration of Mist, our Effektor customers can gain even more insight into their data, enabling them to make more profitable data-driven decisions for their company,” said Ditlev Moltke, COO of Rehfeld Partners.

Rehfeld has successfully completed the TAP program for the integration of Mist with Effektor. Version 6.2 of Effektor will ship on November 1, 2014, and will include Mist integration and extended data warehousing features.

For press inquiries, please contact media@varigence.com or media@rfeld.com. For customer inquiries, please contact sales@varigence.com or sales@rfeld.com.


Varigence Australia Announces New Consulting Partner - Antares Solutions

Written by Varigence Blog on 5.7.2014

Varigence Australia Antares Solutions

Press Release

Sydney, Australia - May 7, 2014 - Varigence Australia is pleased to announce that Antares Solutions have become Consulting Partners under a formal agreement.

Under the agreement Varigence Australia will support Antares Solutions in providing training, consulting, implementation, and first line support in Biml and Mist for Antares clients. Through this partnership Varigence will work with Antares to deliver advanced Business Intelligence (BI) solutions.

“By combining our cutting edge business intelligence technologies with the implementation and consulting expertise of Antares Solutions, we will be able to provide incredible value and service to the end user,” says Paul Schmidt, Varigence Australia General Manager.

“This is a good fit for us - Varigence Australia’s approach is to reduce complexity and cost, but still deliver optimal results with outstanding technology, people and tools. This is also the Antares way,” says Simon Cheadle, Antares Solutions Chief Executive Officer.

Mist is an integrated development environment (IDE) for authoring Business Intelligence Markup Language (Biml) code and data assets. Mist leverages visual design capabilities and debugging features to provide a unified experience for managing entire business intelligence solutions, including relational modelling, data transformation packages, and OLAP cubes. Mist also incorporates modern programming IDE features, including text editors with syntax highlighting, Intelliprompt and quick-info displays, source control integration, and multi-monitor support. Combined with the capabilities of Biml to automatically generate huge amounts of data assets from metadata, reuse code, enforce patterns and best practices, Mist changes the economics of building business intelligence and data warehouse solutions.

Julio Acuna, the Head of Data Solutions at Antares, believes that “the breadth of Antares’ solutions offering is greatly improved by our partnership with Varigence. We can now approach traditional problems related to Data Warehousing and Business Intelligence implementation in a more agile and consistent way. With these tools, the winners are, without a doubt, our customers.”

Varigence is the creator of Business Intelligence Markup Language (Biml) and has been building BI accelerator tools since 2008. Drawing on deep experience in BI solutions, Varigence applies the best innovations from application development to enterprise Business Intelligence, whilst offering an unprecedented level of accessibility to the end user.

For media inquiries, please contact media@varigence.com.au or Rowland Parr (Antares Solutions) 02 8275 8811

For partnership inquiries, please contact partners@varigence.com.au

For customer inquiries, please contact sales@varigence.com.au or info@antaressolutions.com.au


Biml Scandinavian Tour

Written by Varigence.com on 4.5.2014

Biml Scandinavian Tour

Due to the rapid adoption of Biml and Mist throughout Scandinavia, Rehfeld and Varigence are proud to present a 10-day speaking tour on Biml. From May 5th until May 15th, the tour will travel through six cities in four different countries. The highlight of the tour is four full-day Biml Master Classes in Copenhagen, Stockholm, Oslo, and Helsinki. The Master Classes will be taught by the creator of Biml, Scott Currie, and Rehfeld Principal BI Architect Peter Hansen. Scott and Peter will work together to take your Biml skills to the next level. Click on the individual events below to reserve your spot today.

Denmark

Sweden

Norway

Finland

If you have any questions about the tour, please email Finn Simonsen at fas@rehfeld.dk or call +45 3170 1059.


Varigence Australia Launches

Written by Varigence.com on 4.1.2014

Varigence Australia

Press Release

Greenville, SC, USA and Sydney, Australia – March 31, 2014 - Varigence Inc., developers of the leading business intelligence tools Biml and Mist, are pleased to announce the formation of Varigence Australia. The Varigence suite of products is changing the economics of Microsoft BI development by providing powerful tools to reduce BI delivery costs and better leverage a business’s metadata.

Adoption of the Varigence technologies has been rapidly expanding in the US and Europe and now through the formation of Varigence Australia, these premier tools for business intelligence and data warehouse development are far more accessible to businesses in Australia and New Zealand.

“Australia and New Zealand have been among our fastest growing markets over the past year. Varigence Australia will enable us to better serve our current partners and customers, while helping to continue the growth that the Biml ecosystem has enjoyed in Oceania.” Scott Currie, CEO, Varigence.

Varigence Australia will provide sales, support and marketing services for all Varigence customers and partners in Australia and New Zealand. By working with partners and independent consultants, it will provide implementation training and first-line support to help other BI professional services firms create more value for their clients. Biml and Mist bring the level of innovation to business intelligence that the tablet brought to the PC, through end-user accessibility and more cost-effective, faster BI delivery.

“We believe that Varigence Australia’s BI tools will deliver better business intelligence faster and far more affordably than ever before to Australian and New Zealand businesses. Mist and Biml will provide access to a higher level of business intelligence with greater agility for businesses in our region, at a time when competitive edge means everything,” said Paul Schmidt, General Manager – Varigence Australia.

To celebrate its launch, Varigence Australia is a platinum sponsor of the PASS SQL Saturday in Melbourne on 5th April 2014.

About Varigence Australia
Varigence Australia will provide sales, support and marketing services for all Varigence customers and partners in Australia and New Zealand. They will partner with BI professional services firms and independent consultants to introduce Varigence's suite of products. Varigence Australia will provide implementation training and first-line support for Mist, Biml and all other Varigence BI tools. The management team consists of Paul Schmidt (General Manager) and Peter Avenant (Director of Technology), formerly of Avelytics Pty Ltd. www.varigence.com.au

About Varigence
Varigence brings the best innovations in business intelligence architecture to market through application development, delivering enterprise business intelligence faster, easier, and in a much more economical framework. Varigence is the creator of Business Intelligence Markup Language (Biml) and has been building business intelligence accelerator tools since 2008. Through innovative, easy-to-use tools, Varigence products offer an unprecedented level of business intelligence accessibility to the end user and have delivered state-of-the-art business intelligence solutions to companies of all sizes, from internet start-ups through to Fortune 100 companies.

About Scott Currie
Scott Currie is the founder and CEO of Varigence, Inc. Scott is also the creator of Business Intelligence Markup Language (Biml). Prior to founding Varigence, Scott worked for 7 years at the Microsoft Corporation.

About Biml/Mist
Mist is an integrated development environment (IDE) for authoring Business Intelligence Mark-up Language (Biml) code and data assets. Mist leverages visual design capabilities and debugging features to provide a unified experience for managing entire business intelligence solutions, including relational modelling, data transformation packages, and OLAP cubes. Mist also incorporates modern programming IDE features, including text editors with syntax highlighting, Intelliprompt and quick-info displays, source control integration, and multi-monitor support. Combined with the capabilities of Biml to automatically generate huge amounts of data assets from metadata, reuse code, enforce patterns and best practices, Mist changes the economics of building business intelligence and data warehouse solutions.

About Pass SQL Saturday
SQLSaturday is a training event for SQL Server professionals and those wanting to learn about SQL Server. Sessions will be presented by some of the best SQL Server trainers, consultants and professionals in the region. The Professional Association for SQL Server (PASS) was co-founded by CA Technologies and Microsoft Corporation in 1999 to promote and educate SQL Server users around the world. Since its founding, PASS has expanded globally and diversified its membership to embrace professionals using any Microsoft data technology. With a growing membership of more than 100K, PASS supports data professionals throughout the world who use the Microsoft data platform. http://sqlsaturday.com/296/eventhome.aspx


For press inquiries, please contact media@varigence.com.au or media@varigence.com

For partnership inquiries, please contact partners@varigence.com.au

For customer inquiries, please contact sales@varigence.com.au


Varigence and Rehfeld Announce Strategic Partnership

Written by Varigence.com on 3.13.2014

Varigence + Rehfeld

We are pleased to announce a long-term strategic partnership with Rehfeld, the leading provider of Business Intelligence and Data Warehousing professional services in Scandinavia, to bring the premier tools for Business Intelligence and Data Warehouse development to the Scandinavian market.

Under the terms of the partnership Rehfeld will be the exclusive reseller of Varigence’s Mist IDE in Denmark, Finland, Iceland, Norway and Sweden. Rehfeld will also provide training, consulting, implementation, and first line support for Biml and Mist customers in the region. Rehfeld is also committed to working with other professional services firms throughout Scandinavia to assist them in creating more value for their clients with Biml and Mist.

“By combining our cutting edge business intelligence technologies with the implementation and consulting expertise of Rehfeld we will be able to provide unparalleled value and service to data professionals throughout Scandinavia,” said Scott Currie, Varigence CEO.

Mist is an integrated development environment (IDE) for authoring Business Intelligence Markup Language (Biml) code and data assets. Mist leverages visual design capabilities and debugging features to provide a unified experience for managing entire business intelligence solutions, including relational modeling, data transformation packages, and OLAP cubes. Mist also incorporates modern programming IDE features, including text editors with syntax highlighting, Intelliprompt and quick-info displays, source control integration, and multi-monitor support. Combined with the capabilities of Biml to automatically generate huge amounts of data assets from metadata, reuse code, enforce patterns and best practices, Mist changes the economics of building business intelligence and data warehouse solutions.

“At Rehfeld, we’ve already been using Biml and Mist in our client engagements with fantastic results. These tools are truly must-haves for any firm that relies on data to make better decisions. By adopting Mist in our projects, we can offer our customers greater agility, higher quality and increased productivity. We’re very excited to help lead even greater adoption throughout Scandinavia,” said Ditlev Moltke, COO of Rehfeld Partners.

For press inquiries, please contact media@varigence.com or media@rfeld.com.

For partnership inquiries, please contact partners@varigence.com.

For customer inquiries, please contact sales@varigence.com or sales@rfeld.com.


Biml at PASS Summit 2013

Written by Varigence.com on 11.7.2013

Biml had a great showing and, according to this blogger, was a star of PASS Summit 2013.

On Tuesday, October 15th, Varigence and Linchpin People presented the first ever full-day Biml training event. The workshop was held at the Wake Forest University Charlotte Center and sold out, with 50-plus people in attendance. Presenters included Andy Leonard and Tim Mitchell, as well as the creator of Biml, Scott Currie.

Biml Workshop

At the Summit itself, Biml was an integral part of at least six presentations. These presentations included:

Reduce, Reuse, Recycle Presentation


Mist 3.4 Released

Written by Varigence Blog on 11.6.2013

Mist 3.4

Varigence is pleased to announce the release of Mist 3.4. Mist is the only IDE made specifically for Business Intelligence Markup Language (Biml). Released less than two months after Mist 3.3, Mist 3.4 is another successful short release cycle that delivers powerful new features. Get a free 14-day trial of Mist at http://www.varigence.com/Products/FreeTrial.

Mist 3.4 New Features
  • Mist UI Changes
    • Added a forward/back button in the quick access toolbar. Forward and back can also be accessed with Alt-Left/Alt-Right, mouse buttons, or touch gestures. History is maintained for 15 items by default.
    • Added a search box to Project View which enables search/filtering by file name
    • Fix for missing root folders on project load
    • Added context menu commands for the new SQL generation extension methods (see below)
    • Copy Biml context menu item now on all objects. This will give you the raw Biml for any object, even if it is live scripted.
    • Changed the default on logical view to remove empty folders/groups when a search is active
    • Fixed a Mist crash when broken transformers are run from context menu.
    • Improvements to licensing and activation – especially in offline scenarios.
    • Added “View in BimlScript designer” to all logical view root items, which enables you to more easily see expanded Biml for live scripted objects
  • Package Editor
    • Entirely new Compact Mode that can be toggled with a new ribbon button and which is saved as a project setting.
    • ~20% performance improvement in Package Editor load time for big packages.
  • BimlEditor Changes
    • Added highlighting of matching delimiters (XML tags, code nuggets, multiline comments, braces, parentheses, etc.)
    • Added highlighting of other instances of the currently selected text
    • Fixed intelliprompt to work better with code in included files
    • Added indentation guides, which are on by default.
  • Language and Compiler Changes
    • Added SurrogateKey as an option for table column ScdType.
    • File group modeling is now available on databases. Tables and indexes can be assigned to file groups explicitly
    • FilterPredicate has been added to AstIndexNode to support filtered indexes
    • Added DefaultConstraintName to AstTableColumnBaseNode to support naming the default constraint. This is especially useful for source-controlled databases
    • Added new properties for Package emission (VersionMajor, VersionMinor, VersionBuild, CreatorName, CreatorComputerName, CreatedDate).
    • Added support for DataflowOverrides, which provide functionality for overriding many of the default options that the compiler sets on output paths and columns.
    • Fix to enable Text and NText in FlatFileFormats
    • Fixed custom component emitter bug reported in forums.
    • Fixed bug with custom log provider emission
  • Extension Methods
    • Added extension methods to string for XmlEscape and XmlUnescape
    • Added GetColumnCompareList and GetColumnAssignmentList methods on table for easier construction of complex SQL queries (see the sketch after these release notes)
    • New extension methods to get CreateAndDropDdl for relational object types
    • New extension methods to get SELECT, INSERT, UPDATE, and DELETE query templates for tables
  • Package Importer
    • Importer enhanced to use DataflowOverrides
    • Added support in the importer to disable importing SSIS descriptions as annotations (imported by default)
    • Lookup parameterized queries now import only if cache mode != full
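
As a quick illustration of the new table extension methods, here is a hedged BimlScript sketch that emits one UPDATE statement per table. It assumes a relational connection named Target is defined elsewhere in the project, and the exact signature and output of GetColumnAssignmentList may differ from what the comment shows:

<Biml xmlns="http://schemas.varigence.com/biml.xsd">
    <Packages>
        <# foreach (var table in RootNode.Tables) { #>
        <Package Name="Update <#=table.Name#>" ConstraintMode="Linear">
            <Tasks>
                <ExecuteSQL Name="Update <#=table.Name#>" ConnectionName="Target">
                    <!-- GetColumnAssignmentList builds the "col1 = src.col1, col2 = src.col2, ..." fragment -->
                    <DirectInput>UPDATE dst SET <#=table.GetColumnAssignmentList()#></DirectInput>
                </ExecuteSQL>
            </Tasks>
        </Package>
        <# } #>
    </Packages>
</Biml>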

Upcoming Biml Presentations

Written by Varigence Blog on 11.1.2013

There will be several Biml presentations between now and the end of 2013. If you have a presentation focusing on Biml that's not listed, please let us know in the comments below or post it at the Biml User Group on LinkedIn.

Upcoming Biml Presentations


Mist 3.3 Released

Written by Varigence Blog on 9.3.2013

Mist 3.3

NEW BACKSTAGE SPLASH SCREEN

Varigence is pleased to announce the release of Mist 3.3. Mist is the only IDE made specifically for Business Intelligence Markup Language (Biml). Mist 3.3 has several major improvements, including a backstage splash screen, improved build performance, and Attunity Oracle connection support. To see the full release notes for Mist 3.3, please click here.

Mist allows you to model and generate SSIS packages, SSAS cubes, and other BI assets, and to leverage Mist designers for creating and modifying your BI assets. You can also import existing SSIS packages into Mist, allowing you to continue their development directly with Biml.

Further, Mist incorporates BimlScript, which lets you codify Biml creation with reusable templates and transformers. BimlScript makes it trivial to apply changes across several BI assets and enforce design patterns.

Mist is available with a perpetual or subscription license. Subscription licenses are renewed every 30 days with no commitment required. Free 14-day trials are available at Varigence.com.


Stairway to Biml

Written by Varigence Blog on 7.18.2013

Stairway to Biml

Due to Biml's growing popularity among BI professionals, SQLServerCentral.com has launched a Stairway series on Biml. Andy Leonard from Linchpin People and Paul Waters from Varigence will be writing the "Stairway to Biml" articles.

SQL Server Central's Stairways are designed to get a beginner up and running on SQL Server based technologies. Each Stairway is a SQL tutorial series focused on a single topic and is arranged into no more than a dozen easily-navigable tutorials. Each level is the length of a typical magazine tutorial, and emphasizes practical, hands-on learning, with just enough background theory to help you understand the topic at a deeper level.

Level One of the Stairway, entitled "What is Biml," was released earlier today and has already been read over 8,000 times. Levels Two through Four of the Stairway are scheduled to be released as follows:

  • Level 2 - Wednesday, July 31, 2013
  • Level 3 - Wednesday, August 7, 2013
  • Level 4 - Wednesday, August 14, 2013

Levels Five through Twelve will be released over the remainder of 2013.


Biml Workshop 2013

Written by Varigence Blog on 6.13.2013

On Tuesday, October 15th in Charlotte, NC, Varigence and Linchpin People will be presenting a full-day Biml workshop. The workshop will be held at the Wake Forest School of Business, located in Uptown Charlotte just two blocks from the Charlotte Convention Center. The Wake Forest Center has state-of-the-art classrooms with large display screens, as well as full Wi-Fi access and power outlets for all attendees.

Wake Forest Entrance Wake Forest Classroom

The workshop will have fully catered meals throughout the day: a hot breakfast bar, a Carolina-style BBQ lunch, and refreshments and snacks served throughout. All attendees will also receive a Biml workshop t-shirt.

Breakfast Brownies

Registration is only $69. Register before July 15th, and receive the early bird price of just $49. Click here to register.

More information and speaker announcements will be released on Twitter and the Biml Users Group on LinkedIn. To learn more about Biml, go to BimlScript.com.

Biml Workshop T-shirt


All New Varigence.com

Written by Varigence Blog on 4.3.2013

Varigence is proud to announce the relaunch of Varigence.com. Along with a new user friendly design, Varigence.com has added tons of new product information, documentation and videos.

Some of our new features include:

  • FAQ - Find answers to commonly asked questions about Biml, Mist and Vivid.
  • Forums - Improved code block formatting for posting Biml.
  • Product - New product pages highlighting the capabilities and features of Biml, Mist and Vivid.
  • Search - Enhanced search functionality throughout the website.
  • Support - All Biml language documentation, product user guides and samples fully updated.
  • Training - New training pages highlighting our accelerator courses.
  • Videos - New product and feature videos throughout the website.

To see more of the all new Varigence.com click below:

Varigence Homepage


Mist 3.2 Released

Written by Varigence Blog on 3.25.2013

Download Mist v3.2 Now

IMPROVED PACKAGE DESIGNER

Varigence is pleased to announce Mist 3.2. This release has a modernized user interface and improved package designer.

With Mist you can model and generate SSIS packages, SSAS cubes, and other BI assets using Business Intelligence Markup Language (Biml). Or leverage Mist designers for creating and modifying your BI assets. You can also import existing SSIS packages into Mist, allowing you to continue their development directly with Biml.

Further, Mist incorporates BimlScript, which lets you codify Biml creation with reusable templates and transformers. BimlScript makes it trivial to apply changes across several BI assets and enforce design patterns.

Mist is available with a perpetual or subscription license. Subscription licenses are renewed every 30 days with no commitment required.

New features in Mist 3.2 include:

  • UI Improvements
    • Modernized User Interface
      • Package Designer Details
      • Modernized Package Designer UI
      • Improved package dashboard
      • Package surface can scroll during drag and drop
      • When adding a task to the designer, vertical padding is added beneath it
    • Added visual notification to the Windows task bar when a build finishes
    • Added context menus to the Configuration designer
    • Project View multi-select works with shift+click
  • SSIS 2012
    • Fixed emission issues with package and project parameters
    • Fixed emission issues with log events
    • Proper emission of project connections with the Execute Package task
    • Native support for Expression task
    • Added protection level to package projects
    • Added package parameters to Expression Builder
  • Package Import
    • Import without IDs is now the default behavior
    • The following now import correctly
      • Log Events in SSIS 2012
      • Output paths for packages with script components
      • Packages with package parameters
      • Packages with connection names containing a backslash
      • Script tasks and components that use VB
  • Logical View
    • Execute Transformers context menu now sorts transformers alphabetically
    • Execute Transformers context menu now displays transformers that start with an underscore
    • Fixed duplication error when duplicating a package that references a script task or component
  • Project Designer
    • Use Project Deployment is only enabled when targeting SSIS 2012
  • Biml
    • Added ValidationReporter.Report(IFrameworkItem, Severity, Message, Recommendation) overload
    • Several improvements to error and validation reporting
  • Setup
    • Streamlined installer
    • Eliminated the SQL Server prerequisites from Mist installations
For more details on Mist, please visit:

http://www.varigence.com/mist

Download a 14-day free trial of Mist 3.2 at:

http://www.varigence.com/FreeTrial

For information on Mist pricing and licenses, please visit:

https://www.varigence.com/store/


Varigence featured in Greenville Business Magazine

Written by Varigence Blog on 2.1.2013

Varigence is featured in the February issue of Greenville Business Magazine. The article is on page 61 and can be viewed in the magazine's online edition.


Varigence’s Steven and Scott Currie


Mist 3.1 Released

Written by Varigence Blog on 12.15.2012


IMPROVED USER EXPERIENCE

Varigence is pleased to announce Mist 3.1, a release focused on improving user experience within our IDE for Business Intelligence development.

Use Mist to model and generate SSIS packages, SSAS cubes, and other BI assets using Business Intelligence Markup Language (Biml). Or leverage Mist designers for creating and modifying your BI assets. You can also import existing SSIS packages into Mist, allowing you to continue their development directly with Biml.

Further, Mist incorporates BimlScript, which lets you codify Biml creation with reusable templates and transformers. BimlScript makes it trivial to apply changes across several BI assets and enforce design patterns.

New features in Mist 3.1 include:
  • Comprehensive context menus throughout the UI
  • New connection string builder that allows for the easy creation of connection strings within Mist
  • Ability to execute BimlScript transformers directly from context menus
  • Ability to customize your toolbox to your individual preferences

Remember, Mist is now available with a perpetual or subscription license. Subscription licenses are renewed every 30 days with no commitment required.

For more details on Mist, please visit:

http://www.varigence.com/mist

Download a 14-day free trial of Mist 3.1 at:

http://www.varigence.com/FreeTrial

For information on Mist pricing and licenses, please visit:

https://www.varigence.com/store/


Introducing Mist 3.0

Written by Varigence Blog on 11.8.2012


Now With Zero-Commitment Monthly Subscriptions!

Varigence is proud to announce Mist 3.0, a major update to our IDE for Business Intelligence development.

Use Mist to model and generate SSIS packages, SSAS cubes, and other BI assets using Business Intelligence Markup Language (Biml). Alternatively, you can leverage Mist designers for creating and modifying your BI assets. Further, Mist can import existing SSIS packages, allowing you to continue their development directly with Biml.

Mist also incorporates BimlScript, which lets you codify Biml creation with reusable templates and transformers. BimlScript makes it trivial to apply changes across several BI assets and enforce design patterns.

New features in Mist 3.0 include:
  • Full support for SSIS 2012.
  • A redesigned Project View that will be familiar to Visual Studio and BIDS users.
  • Biml assets are now built with MSBuild, enabling custom build scenarios.
  • You can now purchase a perpetual or subscription license. Subscription licenses are renewed every 30 days with no long term commitment required.
For more details on Mist, please visit:

http://www.varigence.com/mist

Download a 14-day free trial of Mist 3.0 at:

http://www.varigence.com/free-trial

For information on Mist pricing and licenses, please visit

https://www.varigence.com/store/


Biml is on Wikipedia

Written by Varigence Blog on 11.6.2012

Biml's popularity is continuing to grow, with more business intelligence professionals using it every day. Because Biml is increasingly mentioned in technical books and blogs, it has garnered its own article on Wikipedia. Check it out at Wikipedia - Biml Article.



Varigence Sponsors SQL Saturday #174

Written by Varigence Blog on 10.29.2012

On Saturday, October 27th, Varigence sponsored SQL Saturday #174 in Charlotte. This SQL Saturday was a BI Edition and had several Microsoft MVPs and industry experts speaking on various Business Intelligence topics.

Paul Waters and Steven Currie gave demonstrations of Biml and Mist to attendees throughout the day. Paul also gave an hour-long presentation on Biml entitled "Automating SSIS Development with Biml." Once again Paul presented to a standing-room-only crowd that loved learning about Biml.

Varigence enjoyed sponsoring SQL Saturday #174 and thanks all the attendees, sponsors, speakers and organizers for this great event.


Steven Currie and Paul Waters at Varigence Booth.


Paul Waters presenting, "Automating SSIS Development with Biml."


SQL Saturday #163 - Biml Presentation Standing Room Only

Written by Varigence Blog on 10.22.2012

On Saturday, October 13th, Paul Waters presented "Automating SSIS Development with Biml" at SQL Saturday #163 in Dallas, Texas. The presentation was standing room only, and the audience loved learning about Biml.

This SQL Saturday was a BI Edition and had several Microsoft MVPs and industry experts speaking on various Business Intelligence topics.


Photo courtesy of Ryan Adams


Biml Presentations At Two SQL Saturdays in October

Written by Varigence Blog on 10.5.2012


In these presentations, Paul Waters will start by showing how Biml can be used to solve common BI problems. Then, after discussing the basics of BimlScript, he will finish by covering practical examples using the BIDS Helper add-in to automate SSIS development.

Both of these SQL Saturdays are focused on BI and will have several Microsoft MVPs and industry experts speaking on various Business Intelligence topics. SQL Saturday #163 is on October 13th and SQL Saturday #174 is on October 27th. To learn more about these events follow the links below:

[SQL Saturday #163 event page]

[SQL Saturday #174 event page]


BimlScript.com Launches

Written by Varigence Blog on 9.19.2012

Due to the increased worldwide use of the Business Intelligence Markup Language (Biml), a new website was launched last week for users of Biml. BimlScript.com is a community-based website that allows users to contribute code snippets, walkthroughs and videos on Biml. The site already has several contributors and dozens of content posts that Biml users can reference to find solutions to common Business Intelligence problems.

Anyone who wants to learn how to code in Biml, ask questions about a hard-to-solve problem, or just see what other Biml users are doing should check it out.

https://varigencecomstaging.blob.core.windows.net/blogimages/BimlScript.com.png


Great New SSIS Book Talks About Biml

Written by Varigence Blog on 9.10.2012

Varigence just received a copy of SQL Server 2012 Integration Services Design Patterns.

This book was written by SSIS gurus Andy Leonard, Matt Masson, Tim Mitchell, Jessica Moss and Michelle Ufford.

Design Patterns devotes an entire chapter to the revolutionary Business Intelligence Markup Language (Biml), which was created by Varigence. Biml has allowed BI developers worldwide to eliminate the repetition in generating SSIS packages, and do in hours what previously took days.

This book is a terrific resource for learning Biml. The Biml chapter begins with a brief history and then explains step by step, with several screen shots, how to:

  • Build your first Biml file inside of BIDS Helper
  • Build a pattern for a Basic Incremental Load in an SSIS package
  • Use Biml as an SSIS Design Patterns Engine

In addition to the Biml chapter, Design Patterns covers 19 additional topics related to SQL Server 2012 that will help take your SSIS skills to the next level. SQL Server 2012 Integration Services Design Patterns is available through Amazon or Apress.



Varigence Is Golden At SQLBits X

Written by Varigence Blog on 2.4.2012

Varigence is excited to announce that we will be a Gold Sponsor of SQLBits X.

SQLBits X consists of 3 days of amazing SQL Server content and serves as the official UK Technical Launch for SQL Server 2012. SQLBits will be taking place at Novotel West in London from March 29th until March 31st.

Varigence will be exhibiting on site at SQLBits, so stop by our booth and say hello. We’ll be giving away mini-footballs and having a raffle for an iPad 2. In addition, our CEO Scott Currie will be giving two lunchtime presentations. On Friday, Scott will discuss how to accelerate your BI projects with Varigence tools and frameworks. On Saturday, he’ll show how you can use a free solution with Business Intelligence Markup Language (Biml) and BIDS Helper to achieve unparalleled performance and usability from SSIS.

Register for SQLBits now at www.sqlbits.com.

If you’d like to talk with someone from Varigence before SQLBits to discuss your organization’s business intelligence needs, we’d love to meet with you. We’ll be available to demonstrate our products and services to anyone in the London area on March 27th, 28th and 29th. Contact us now to set up an individualized consultation for your organization.

Cheers,

Varigence


Results of the $10,000 Biml Challenge

Written by Kang Su Gatlin on 10.14.2011


The $10,000 Biml Challenge took place over the last two days (10/12 & 10/13) on the Exhibition Hall floor at SQL PASS Summit 2011.

After compiling the results and conducting the drawing, we now have the winners.

The winners of the $500 drawing are:

Michael Wells & Jandee Richards

The winner of The $10,000 Biml Challenge, taking home the $10,000 prize, is:

David Darden

David Darden blazed through the Challenge in a mere 6 minutes and 39 seconds. Congratulations to David, Michael, and Jandee!

While the $10,000 Biml Challenge is now over, you can always download Mist and Biml at http://www.varigence.com/Products/Home/FreeTrial.

Shortly we'll post the actual contest instructions so you can see what users did during the Biml Challenge.


A (SSAS) Maestro Among Us

Written by Kang Su Gatlin on 9.22.2011

Microsoft has recently started a new program called SSAS Maestros: an extremely deep technical training course on Analysis Services, which includes labs and even a take-home exam project.

There were hundreds of applicants and eighty-one attendees. From that pool, Microsoft recently announced their first class of 11 SSAS Maestros, and I'm happy to congratulate my colleague John Welch for being part of this elite group!

I work with John, so I can honestly say I'm not surprised to hear about this, but nevertheless it's a great honor for him. Way to go, John, and congrats to the other ten Maestros!


Biml: A Primer for the $10,000 Biml Challenge

Written by Varigence Blog on 9.15.2011

This is the first blog entry focused on helping users prepare for the $10,000 Biml Challenge. It will give you a basic understanding of the BI markup language, Biml.

Biml is an XML-based language for specifying the full set of assets in a BI and/or DW project. With Biml you can design everything from relational databases to data integration and transformation packages to OLAP cubes. The Biml language is fully specified online. The rest of this blog entry will give a more informal look at how to use Biml.

Biml works in a way consistent with other XML-based languages. If you want to define something in Biml, specify the element for that item. For example,

<Package Name="SamplePackage">
...
</Package>

This creates a new package with the name SamplePackage. If you want to add something to this package, for example, a dataflow task, you just add it to that element. This again is consistent with typical XML-based languages. For example,

<Package Name="SamplePackage">
   <Tasks>                
      <Dataflow Name="SampleDataflow">
...
      </Dataflow>
   </Tasks>
</Package>

Notice that since there can be many tasks in a package, there is a "Tasks" element that acts as a container where we can place multiple tasks.

Using XML attributes you can specify characteristics for the elements. For example, if you're specifying a Lookup element in Biml, you can use attributes on the Lookup to specify things like the connection and the cache mode:

<Lookup Name="Lookup Transformation" OleDbConnectionName="SportsData"
NoMatchBehavior="IgnoreFailure" CacheMode="Partial">

There is a Biml snippets library online where you can see short illustrative snippets for creating different types of BI assets -- everything from OLEDB connections to the Pivot transformation. Biml by itself is very powerful and useful, but there is a feature called BimlScript that allows users to automatically script and generate Biml code. BimlScript will be covered in more detail in a future contest blog entry.
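
As a small taste ahead of that future entry, here is a minimal BimlScript sketch in the same style as the snippets above. The <# #> code nuggets run C# when the file is compiled, so this template generates three packages:

<Biml xmlns="http://schemas.varigence.com/biml.xsd">
   <Packages>
      <# for (int i = 1; i <= 3; i++) { #>
      <Package Name="SamplePackage<#=i#>">
      </Package>
      <# } #>
   </Packages>
</Biml>

Each loop iteration emits another Package element, so the compiled output contains SamplePackage1 through SamplePackage3.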

To execute your Biml code, you must compile it. The compiler for Biml is Hadron.exe. The output of the compiler varies depending on what your Biml code represents, but for the contest, we'll be focused on Packages, which compile to SSIS DTSX packages. In practice, you'll rarely use Hadron.exe directly, but rather will build your packages with Mist, the graphical IDE for Biml.

In our next blog entry we'll learn about and use Mist, and walk through the process of building a very simple DTSX package.


The $10,000 Biml Challenge at PASS Summit 2011

Written by Varigence Blog on 9.11.2011

When you have a technology that drastically increases the productivity of BI and Data Warehouse developers and architects, it can be difficult to convey just how groundbreaking it is. We believe Mist & Biml are such products. Once you have the chance to use them, you'll never want to go back to the old way of building your data solutions.

This year at PASS Summit (October 2011, Seattle, WA), we're putting our money where our mouth is by sponsoring the $10,000 Biml Challenge during the exhibition.

The contestant who solves the typical BI/DW task assigned in the competition the fastest will win $10,000 in cash. This task would normally require hours or longer with traditional approaches, but with Biml and Mist, the time limit will be a generous 15 minutes.

So what do you need to do to win the $10,000?

  1. Register and attend PASS Summit 2011. This is where the challenge will be held. You must be an attendee or exhibitor to participate.
  2. Visit our website and reserve your slot in the competition today! We'll provide a workstation in a quiet area, so space is limited.
  3. Read this blog. As the PASS Summit approaches, we'll post a variety of resources that will help you prepare, including a practice challenge.
  4. Download Mist/Biml. Get familiar with the tools today. In the near future we will announce and release the official "Challenge" version of Mist/Biml, so you can practice with the exact bits that will be used in the Challenge. But until then the current version is very similar to the official Challenge version.
  5. Read the official rules.
  6. Read The $10,000 Biml Challenge page for the latest information about the challenge.

Be sure to reserve your slot in the competition today! Spaces are limited. You've probably never had a better opportunity to walk away with $10,000 at a conference before -- and at the same time you can raise your professional IQ by becoming (more) acquainted with some breakthrough technology.

And, just to make things more interesting, everyone who completes the competition, or the online practice challenge, will be entered in a drawing to win one of two $500 prizes. No purchase necessary. Void where prohibited. See official rules for details.

How To - Tasks and Transformations: FTP Task

Written by Varigence Blog on 8.12.2011

It’s been a little while since my previous post, as we’ve been heads down on our next Mist and Hadron releases. However, I’m back today to begin discussing the Data Preparation SSIS tasks. I’ve already authored a post on the SSIS File System task, and I’m continuing today with a discussion of the FTP task.

Background:

The SSIS FTP task allows you to perform various FTP operations in your package. The task implements eight different operations, although three are duplicated by the File System task:

  • Send files – Sends a file from the local computer to the specified FTP server
  • Receive files – Downloads a file from an FTP server to the local computer
  • Create local directory – Creates a folder on the local computer (duplicated by the File System task)
  • Create remote directory – Creates a folder on an FTP server
  • Remove local directory – Deletes a folder on the local computer (duplicated by the File System task)
  • Remove remote directory – Deletes a folder on an FTP server
  • Delete local files – Deletes a file on the local computer (duplicated by the File System task)
  • Delete remote files – Deletes a file on an FTP server

The FTP task can be useful in any workflow that needs to apply transformations to data, before or after moving the data to a different location. In this post, I’ll walk through using the SSIS FTP task in two examples. I’ll then demonstrate the same examples in Mist.

Using BIDS:

To get started with the FTP task, first find it in the Toolbox tool window.

[screenshot]

Next, drag and drop the task onto the package’s design surface.

[screenshot]

The red X icon indicates the task has errors. The Error List displays the following:

[screenshot]

From the error list, it’s clear the task requires a connection. This makes sense considering the task needs to know how to connect to the FTP server.

To solve that problem, right click in the Connection Managers area and select New Connection… in the context menu.

[screenshot]

This opens the Add SSIS Connection Manager dialog. Select the FTP connection from the list and press Add… to open another dialog for the FTP Connection Manager editor.

[screenshot]

This is where you enter your FTP server settings, including the server name and port. Regarding credentials, the FTP connection manager supports anonymous authentication and basic authentication, but not Windows authentication.

Once you’ve filled in the dialog, be sure to test your connection to confirm it works. Then press OK to dismiss the dialog.

Next, double click on the FTP task to open its editor dialog.

[screenshot]

In the General tab, there’s a conspicuous blank to the right of the FtpConnection property. Selecting the FtpConnection property displays a dropdown button. Click on the dropdown button to select the just created FTP Connection Manager.

alt text here…

Beneath the FtpConnection property is the StopOnFailure property. Its value indicates whether the FTP task stops when an FTP operation fails.

Switching to the File Transfer group, the default operation for the task is Send Files. The send files operation transmits files from your local disk to the specified FTP server.

[screenshot]

The Local Parameters group provides two properties for controlling the local file. The LocalPath property contains the path of the file being sent, as a string. The path string can be entered manually, or obtained by creating a File Connection manager in the property’s dropdown list.

The IsLocalPathVariable property indicates whether the LocalPath is set by a variable, as opposed to a string. If it’s set to true, the LocalPath property name changes to LocalVariable and its value is set by a variable list dropdown.

In the Remote Parameters group, there are three properties for controlling how the file is received on the FTP server. IsRemotePathVariable functions just like IsLocalPathVariable. If it’s set to false, the RemotePath property holds a string whose value is a path on the FTP server where the files are to be sent. If it’s true, then RemotePath points to a variable that holds the value. The OverwriteFileAtDest property indicates whether a file with the same name as a sent file can be overwritten in the RemotePath directory.

Along with the Operation property, the Operation group contains the IsTransferAscii property. This property controls whether an FTP transfer should occur in ASCII mode.

For the other FTP operations, you’ll see that Local Parameters properties appear for local operations (e.g. create local directory) and Remote Parameters properties appear for remote operations (e.g. delete remote files). However, their meanings remain the same.

A simple, and generally low-risk, test for the FTP task is to create a remote directory at the root of your FTP server. To do this, select the Operation property, causing a dropdown button to appear. In the dropdown list, select Create remote directory.

[screenshot]

This changes the parameters in the dialog box as follows:

[screenshot]

Next, select the Remote Path property and click on the ellipses button.

[screenshot]

This opens a directory browser dialog, allowing you to navigate to the desired directory, with the dialog generating the appropriate directory path.

[screenshots]

After pressing OK, the selected location is stored in the RemotePath variable.

[screenshot]

To create a directory beneath documentation, type /testDir after the end of the path.

[screenshot]

Finally, press OK in the FTP Task Editor dialog to store the changes made to the FTP task.

Notice that the Error List is now empty. If you press F5 to run the task, the directory will be added on your FTP server.

Now that the testDir directory is present, let’s send a file to it. First, open the FTP task again and set Stop on Failure to false.

[screenshot]

If the directory already exists, we’ll continue to the next task.

Second, add an additional FTP task and link the tasks with a precedence constraint.

[screenshot]

Third, double click on the Precedence Constraint to open the Precedence Constraint editor. Inside, switch the Evaluation Value from Success to Completion. Then press OK to save the changes.

[screenshot]

Fourth, double click on FTP Task 1 to open its editor dialog. In the General group, set the FtpConnection to the previously created FTP Connection Manager.

Then switch to the File Transfer group. For the RemotePath property, use the same directory path that was created in the first FTP task. The notion is that the initial task creates the directory, if necessary, so the second task can copy files to it. Notice that I also set OverwriteFileAtDest to true, so newer versions of the file overwrite older ones.

[screenshot]

In the Local Parameters group, select the LocalPath property, open its dropdown, and select New Connection…

[screenshot]

This opens the File Connection Manager dialog, where you can browse to find the file to be sent, or type its path directly in the File textbox.

[screenshot]

Once entered, press OK to dismiss the dialog.

Once complete, the dialog properties are set as follows:

[screenshot]

Press OK to store your changes and then press F5 to run your package. This time, the directory will be created if it’s missing and then, the file will be sent to the directory on your FTP server.

If you’re wondering about sending multiple files, it can be done. In fact, you would use the same approach as when using the File System task to copy multiple files; use the For Each File enumerator to enumerate across files in a directory. See the Foreach File loop post for a demonstration.

Receiving multiple files from your FTP server doesn’t require a container task, since the FTP task will transfer all files in the specified remote path to your specified local path.

It’s also worth noting that the FTP task supports the * and ? wildcard characters in paths. However, they can only be used in filenames. For instance, C:\*\documentation.txt won’t work, but C:\documentation\*.txt will.

Using Mist:

Now, let’s create the same FTP tasks in Mist.

To start, you’ll need to create a package. If you’re unfamiliar with how to create a package in Mist, you can follow the first two steps in the Mist User Guide’s Creating a Package topic.

Once your package is open, navigate to the Toolbox tool window.

[screenshot]

Select and then drag the FTP task onto the Package’s design surface.

[screenshot]

The task has a red X icon, just like in BIDS, since it has errors.

Look below the package designer and select the Package Details tab to bring the Package Details tool window to the front. Within the Package Details tool window, you can edit the FTP task’s properties. The non-modal nature of the tool window allows you to make other changes in your package or project without having to cancel out of dialogs first.

alt text here…

In the BIDS sample, I started by creating an FTP task that creates a remote directory. To do that in Mist, open the Operation dropdown and select Create Remote Directory.

alt text here…

Of course, this task requires a connection to the FTP server. If you click on the FTP Connection dropdown in the Package Details pane, you’ll see that it only lists (No Selection). Thus, you’ll need to create an FTP connection. You can follow the steps in the Mist User Guide’s Creating New Connection section to create a connection, although you’ll want to make an FTP connection instead of an OleDb connection.

The designer for the FTP connection has one red-bordered field, clearly indicating that you need to provide a server name.

alt text here…

Matching the FTP Connection Manager editor dialog from the BIDS sample, provide an appropriate server name, along with any required authentication values.

Now, reopen the package and select the FTP task to view its details. In the FTP Connection combo box, you can now select the FTP connection.

alt text here…

The next step is to enter the remote directory path inside the Remote Path text box. The Method dropdown is set to Direct to indicate that the remote path value is being entered as text, as opposed to being provided via a variable.

To match the BIDS sample, also uncheck the Stop on Operation Failure checkbox.

alt text here…

Next, you need to create the second FTP task, which will send a file to the FTP server. To begin, drag and drop another FTP task onto the package design surface.

alt text here…

Then, drag a line from FTP Task 1’s bottom node to FTP Task 2’s top node. This will create a precedence constraint.

alt text here…

alt text here…

Notice that when the precedence constraint is created, a green line appears with an S. The letter S and the green color indicate that the precedence constraint’s evaluation value is Success. To change that, click on the precedence constraint to select it. The Package Details area changes to a data grid that lists the constraint, since you’re now viewing the precedence constraint’s details.

alt text here…

Double click on the Evaluation Value cell to display its combo box. Open the combo box and select Completion to change the constraint’s evaluation value.

alt text here…

Once done, the precedence constraint will become blue and its label’s letter will switch to C.

alt text here…

Then, click on FTP Task 2. This updates the Package Details window with the task’s settings. You can begin by selecting FtpConnection1 in the task’s FTP Connection dropdown.

alt text here…

Also, enter the same remote path as FTP Task 1 in the Remote Path text box.

alt text here…

Next, you will need to create a File connection to specify the file being sent. You can follow the steps in the Mist User Guide for Creating a Connection, but remember to make a File connection.

The designer for the File connection has one red-bordered field, clearly indicating that you need to provide a file path.

alt text here…

Click on the ellipsis button, to the right of the red border, to display an Open File dialog to help you find your file.

Matching the BIDS sample, provide an appropriate file path for your file connection. Also, feel free to rename the connection so it’s easy to identify.

alt text here…

Now, reopen the package, select FTP Task 2, and view its details again. In the Local Path’s File Connection combo box, you can now select FtpFileConnection.

alt text here…

With that, you’re now ready to build your project and open the generated assets in BIDS to run the package.

Next Time:

I’ll be continuing with the theme of Data Preparation tasks.

Links:

FTP documentation for the Biml language

FTP task - MSDN

-Craig

Thumbnail

How To - Tasks and Transformations: Message Queue Task

Written by Varigence Blog on 7.12.2011

My previous post returned to the theme of workflow tasks by discussing the SSIS Execute Package task, along with what Mist brings to the table. It’s now time to finish off the workflow tasks with the Message Queue task.

Background:

Microsoft Message Queuing (MSMQ) is a message queue implementation that’s existed in Windows operating systems for over 15 years. Its intent is to allow applications running on separate servers to communicate in a failsafe manner. From Wikipedia: “MSMQ is responsible for reliably delivering messages between applications inside and outside the enterprise. MSMQ ensures reliable delivery by placing messages that fail to reach their intended destination in a queue and then resending them once the destination is reachable.”

In terms of packages, MSMQ can be useful when two independent packages need to communicate with one another; for instance, when the outcome of one package’s execution needs to be communicated to another package. Of course, it’s possible to write to a message queue from any Windows application; synchronization is not limited to just between packages.

In this post’s example, I’m going to walk through using the MSMQ task in BIDS and Mist to receive messages from the message queue. The scenario I’m picturing is that you have some application that performs required processing before your package should run. Thus, your package must wait to receive a message that processing is finished before executing.

Installing MSMQ:

To work through this sample, the computer where you run it must have MSMQ installed. If you’re unsure if MSMQ is installed, you can check by right clicking on My Computer and selecting Manage.

alt text here…

This opens the Computer Management console. Expand the Services and Applications node and check if a Message Queuing item is present. If it’s present, then MSMQ is installed. Otherwise, you need to install it.

At the end of this article, I've included a link to instructions for installing message queuing on Windows 7 and Server 2008 R2. Additionally, you can search online for installation instructions for other Windows operating systems.

Getting Started:

Because this sample involves messaging, and messages require a sender and a receiver, I’ve written a very simple C# application that sends a message to a message queue. I built the sample using Visual Studio 2010, although it runs on .NET 2.0 or later.

The message generator application uses a hard-coded string for a message queue name, following the pattern serverName\private$\queueName. serverName is set to “.” to indicate the local server, and the queue name is arbitrarily set to MsmqSample, yielding .\private$\MsmqSample.

The message being sent is a simple text message of “My Msmq Sample”. The message also needs its label to be set so the MSMQ task can determine if the message matches one of its message types. In this case, the message’s Label property is set to “String Message.” See this MSDN forum post for more details.

When running, the application first checks if the message queue exists. If it does, it will send a message. If the queue doesn’t exist, it’s created and then the message is sent.

To confirm the application works, right click on My Computer and select Manage in the context menu. This opens the Computer Management console. Notice that under Services and Applications, there’s a Message Queuing directory.

Now, run the message generator application to send a message. To confirm the message was sent, right click on Private Queues and select Refresh. A new queue named msmqsample should appear. Expand its node to reveal its child items and select the Queue messages item.

alt text here…

Notice that in the center list, there is a single message. If you right click on the message and select Properties, a Properties dialog appears with detailed information about the message. If you select the Body tab, you’ll see the message’s body, confirming the intended message was sent.

alt text here…

With that working, it’s now time to discuss receiving the message within a package.

Using BIDS:

To get started with the Message Queue task, first find it in the Toolbox tool window.

alt text here…

Next, drag and drop the task onto the package’s design surface.

alt text here…

The red X icon indicates the task has errors. The Error List displays the following:

alt text here…

From the error list, it’s clear that the task requires a connection, which makes sense since it needs to be notified when a message is added to the message queue.

Double clicking on the Message Queue task brings up its properties dialog.

alt text here…

If you click inside the MSMQConnection field, you’ll see a down arrow button. Click on it to open a dropdown that invites you to create a new connection.

alt text here…

Click on “New connection” to open the MSMQ Connection Manager Editor.

alt text here…

The Path property needs to match the message queue path used in our message generator application, since this package will receive messages sent by the application. Enter the path in the Path text field.

alt text here…

The Test button verifies that the path is valid. Once a path is correctly entered, press OK to dismiss the dialog.

Next, in the Message property’s field, click on its value to open a popup. Switch the message’s type to Receive message.

alt text here…

As a result, in the dialog’s left list box, you can now click on the Receive item, which replaced the Send item.

alt text here…

The Receive page has several properties:

RemoveFromMessageQueue indicates whether to remove the message from the queue once it’s received. I suggest setting this to true to avoid having to remove messages manually.

TimeoutAfter lets you set a timeout, in seconds, for how long the task will wait if no message is received. The related ErrorIfMessageTimeOut property indicates whether an error message is displayed if the timeout is reached.

The MessageType property indicates the message format that’s expected. You can choose from:

- Data file message: The message is in a data file
- Variable message: The message is stored in a package variable
- String message: The message is a string in the queue
- String message to variable: The message is a string that will be saved to a variable

The Compare property is set to None by default. However, you can set Compare to Exact match, Ignore case, or Containing. These compare types indicate that the string message must meet the compare criteria for the Message Queue task to process it. Otherwise, the message is ignored.

With task setup complete, press OK to dismiss the Message Queue Task editor dialog box. Build and run the package in BIDS and, while it’s running, run the Message Generator application. The application will send a message to the queue. Your task should respond by turning green as it successfully processes the message.
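For reference, the same setup can also be expressed in Biml. Treat this as a sketch only: the connection and task element names used here (MsmqConnection, MessageQueue) and the attribute names (TaskType, MessageType, RemoveFromQueue) are best-effort guesses, and the Message Queue documentation for the Biml language linked at the end of this post has the authoritative syntax.

<Biml xmlns="http://schemas.varigence.com/biml.xsd">
    <Connections>
        <!-- Path matches the queue used by the message generator application -->
        <MsmqConnection Name="MsmqConnection1" Path=".\private$\MsmqSample" />
    </Connections>
    <Packages>
        <Package Name="MsmqSample" ConstraintMode="Linear" AutoCreateConfigurationsType="None">
            <Tasks>
                <!-- Wait for a string message and remove it from the queue once received -->
                <MessageQueue Name="Message Queue Task" MsmqConnectionName="MsmqConnection1" TaskType="Receive" MessageType="String" RemoveFromQueue="true" />
            </Tasks>
        </Package>
    </Packages>
</Biml>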

Using Mist:

To start, you’ll need to create a package. If you’re unfamiliar with how to create and open a package, you can follow the first couple steps in the Mist User Guide’s Creating a Package topic.

Once you’ve opened the package, navigate to the Toolbox tool window and find the Message Queue task.

alt text here…

Note that, for now, you have to search the treeview for the task you’re looking for. In an upcoming release, however, the Toolbox tool window gains a search box that makes it easy to find the task you want.

Select and drag the Message Queue task onto the Package’s design surface.

alt text here…

The task has a red X icon indicating that it has errors.

alt text here…

The errors explain that Mist requires you to provide a MsmqConnection in the Message Queue task. To accomplish that, return to the Logical View tool window, right click on the Connections group, and add an MSMQ connection.

alt text here…

Double click on the connection to open the connection editor.

alt text here…

In the Path field, enter the message queue path.

alt text here…

Then, reopen the package and select the Message Queue task. Click on the Package Details tab to see the properties for the task.

alt text here…

Looking at the details tool window, you can see that Mist mirrors the properties available in BIDS. Additionally, there are separate areas for both Send Message and Receive Message.

To assign the connection to the Message Queue task, open the MSMQ Connection dropdown and select MsmqConnection1.

alt text here…

To indicate that a received message should be removed from the queue, simply check the Remove from Queue checkbox.

Finally, right click on the package in the Logical View and select Build & Open in BIDS. That command will build your package and then open it in BIDS. You can then run the Message Generator and see that the task succeeds.

Samples:

The BIDS and Mist examples demonstrated above, along with the Message Generator application, can be downloaded here.

Links:

Message Queue documentation for the Biml language

Message Queue task - MSDN

Install Message Queuing on Windows 7 and Windows Server 2008 R2

Craig

Thumbnail

How To – Tasks and Transformations: Execute Package Task

Written by Varigence Blog on 6.30.2011

In my last post, I demonstrated the SSIS Foreach File loop container task in conjunction with the File System task. This week, however, I want to return to the theme of workflow tasks and discuss the Execute Package task.

Background:

The Execute Package task has a straightforward purpose: allow a package to run other packages in its workflow. This enables packages to be reusable, modular components that can be shared among workflows. The package being run by the Execute Package task is usually referred to as the child package; the package running the Execute Package task is the parent.

The scenario I’m going to demonstrate is an archiving situation where one package archives files while another monitors available disk space on the archive drive, informing someone if free space falls below some threshold.

Using BIDS:

I’m going to start by creating the package that monitors free space. This will be the child package that’s run by the Execute Package task. To monitor free disk space, I’m going to use a WMI Event Watcher task. If the watcher indicates that free space has fallen below a specified threshold, a Send Mail task will notify someone.

The main thing I want to address here is using a variable to control the free space threshold. As mentioned above, an advantage of using the Execute Package task is reusability. By parameterizing the threshold value, multiple packages can reuse this child package, each with its own free space minimum. The variable is going to be embedded in the WMI Event Watcher’s WQL query.

Note that if you’re looking for in-depth discussion on creating the tasks for this package, please review my past posts on the WMI Event Watcher and Send Mail tasks.

The first step is to create the variable. To do that, right click on the package designer’s background to open a context menu.

alt text here…

In the context menu, select Variables, which opens the Variables tool window.

alt text here…

Next, press the Add Variable button to create a variable.

alt text here…

Notice that the variable’s scope is DiskSpaceWatcher, which is the name of my package.

I recommend renaming the variable to something clearer, like FreeSpaceThreshold.

alt text here…

In my original post on the WMI Event Watcher task, I authored the WQL query using the Direct input source type. This time, I’m going to use an expression so I can include the FreeSpaceThreshold variable in it.

After adding a WMI Event Watcher task, double click on it to open its editor dialog, and select the Expressions item on the left side.

alt text here…

Click on the Expressions expander to expose an Expressions property. Click inside the Expressions field to display an ellipsis button.

alt text here…

Click on the ellipsis button to open the Property Expressions Editor dialog box.

alt text here…

This dialog allows you to map a task property to an expression. Clicking in the Property cell exposes a combo box; clicking a second time opens the combo box.

alt text here…

Select the WqlQuerySource property in the dropdown.

alt text here…

Notice that the Expression cell has an ellipsis button. Click on it to open the Expression Builder dialog.

alt text here…

One advantage/disadvantage of the BIDS Expression Builder dialog is that it dutifully informs you whether the expression you entered can be processed.

alt text here…

While that’s certainly helpful to know during design, it’s simultaneously annoying, since you’re not allowed to save the expression text until you fix it. I’ll show you later how Mist improves on this.

The next screenshot shows the expression you should enter for this example.

alt text here…
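In case the screenshot is hard to read, the expression should look roughly like the following. The WITHIN polling interval and the specific WMI properties (DeviceID, FreeSpace) are assumptions based on a typical Win32_LogicalDisk free-space query, so adapt them to the query you built in the WMI Event Watcher post:

    "SELECT * FROM __InstanceModificationEvent WITHIN 60 WHERE TargetInstance ISA \"Win32_LogicalDisk\" AND TargetInstance.DeviceID = \"C:\" AND TargetInstance.FreeSpace < " + (DT_WSTR, 20)@[User::FreeSpaceThreshold]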

Some key things to notice in the expression are:

- It’s surrounded with quotes.
- The quotes around Win32_LogicalDisk and C: are escaped with a backslash.
- The FreeSpaceThreshold variable is appended to the expression using the + concatenation operator.
- The variable is cast to DT_WSTR.

After entering a valid expression, you can use the Evaluate Expression button to see the evaluated value. Notice that the FreeSpaceThreshold variable appears as 0 in the Evaluated value text.

You can now dismiss the dialog, and save the expression text, by pressing OK. You’ll be returned to the WMI Event Watcher Task editor. Interestingly, the WqlQuerySource property is still blank, despite setting an expression. Furthermore, trying to close this dialog will display an error that the WQL query is missing.

alt text here…

To solve this, you can enter some text into the WqlQuerySource field. I entered select. Afterwards, you can dismiss the dialog. If you reopen it, you’ll see that the WqlQuerySource field is populated with the evaluated WQL query expression.

The final workflow for this package looks like:

alt text here…

If the WMI Event Watcher task succeeds, indicating the threshold has been exceeded, the Send Mail task runs, notifying the appropriate person that I’m running out of archiving disk space. If the task times out, it will be treated as a failure and the package will stop executing.

With the child package laid out, it’s time to implement the parent package. To start, find the Execute Package task in the Toolbox.

alt text here…

Next, drag and drop the task onto the package’s design surface.

alt text here…

The red X icon indicates the task has errors.

alt text here…

Double clicking on the Execute Package task brings up its dialog. Clicking on the Package menu item in the left list box shows the properties specific to this task.

alt text here…

The Execute Package task needs to know where the to-be-executed package resides. You can choose between a package that resides in a SQL Server msdb database or a package anywhere in the file system. For this sample, it makes sense to choose the file system. To do that, click inside the Location field and open the dropdown to select File system.

alt text here…

Once File system is selected, the PackageName field beneath it becomes the PackageNameReadOnly field, and is disabled. In the File system case, BIDS determines the package name once you select the package in the file system. If you use the SQL Server location, you’d need to provide the package name explicitly.

The next step is to click inside the Connection field. In the dropdown, select New connection.

alt text here…

Selecting the new connection brings up the File Connection Manager Editor dialog, where you can set up the connection.

alt text here…

If you click on the Usage type dropdown, you’ll notice that your only choice is Existing file.

Clicking on the Browse button provides a dialog to find your package on disk. Once selected, its full path is entered in the File textbox.

alt text here…

Click OK to finish creating the connection to the package file.

The next property is the Password property. This lets you enter a password for a password protected child package. As this sample doesn’t call for a password, you can skip it.

ExecuteOutOfProcess is the final property. If set to true, the child package will run in a separate process from the parent package. When deciding whether to run the parent and child packages in the same process, there are some important considerations:

- Memory: Having a child package run in its own process requires more memory. Depending on your available hardware, that may be a factor.
- Impact of child package failure: When a child package runs in its own process, it can fail without impacting the parent process. If you want the child and parent packages to fail together, then the child package should be run in the same process as the parent.

For this sample, you can leave ExecuteOutOfProcess set to false.

alt text here…

With that, the Execute Package task is ready to execute another package.
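For comparison, the Execute Package documentation for Biml (linked at the end of this post) shows how this task is declared in code. A minimal sketch follows; the child element name used to reference an in-project package is an approximation, though it matches the Package input method you’ll see in the Mist section below.

<ExecutePackage Name="Execute Package Task">
    <!-- Run the DiskSpaceWatcher child package from the same project -->
    <Package PackageName="DiskSpaceWatcher" />
</ExecutePackage>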

As for the archiving workflow, you can use a Foreach Loop Container whose folder path is the directory of files to be archived. The File System task can use a move file operation to move each file from the directory to the archive location. You can review my previous posts on the Foreach File Loop and File System tasks for more details.

The final workflow for the parent package should look like:

alt text here…

However, there’s still one thing missing; I haven’t shown how to pass the free space threshold value to the child package. That can be done using a Package Configuration.

The first step is to create a variable on the Execute Package task. This variable’s Data Type needs to match the variable on the child package. I’ve chosen to give this variable the same name as the child package’s variable to show the connection between the two, although that’s not required.

alt text here…

Notice that the Data Type is Int64 and the Value field is 104,857,600. This is my free space threshold in bytes, which equals 100 MB.

Next, open the child package, right click in the package designer’s background, and select Package Configurations in the context menu.

alt text here…

This opens the Package Configurations Organizer dialog. Check the Enable package configurations checkbox so you can add a configuration.

alt text here…

Next, click Add to open the Package Configuration wizard.

alt text here…

Press Next to advance to the Configuration Type selection screen.

The first part of configuration creation is selecting the configuration type. The configuration type dropdown lets you choose from several options. You will be using the parent package’s variable, so select Parent package variable in the Configuration type combo box.

alt text here…

After selecting the parent package variable configuration type, the name for the textbox beneath the dropdown changes to Parent variable.

alt text here…

This is where you specify the name of the parent variable to read from.

alt text here…

Clicking Next brings you to the next configuration screen, which asks you to select the property whose value will be set by the configuration. In this case, it’s the Value property on the child package’s FreeSpaceThreshold variable.

alt text here…

To set the property, navigate the Objects tree view to the child package’s Variables folder and expand the FreeSpaceThreshold’s Properties list to select the Value property.

Click Next to advance to the final screen, which confirms the configuration settings.

alt text here…

Click Finish to store the configuration.

With the package configuration complete, you can build and run your project.

Using Mist:

To start, you’ll need to create parent and child packages. If you’re unfamiliar with how to create a package, you can follow the first couple steps in the Mist User Guide’s Creating a Package topic.

Similar to the BIDS sample, I want to focus on variable creation, creating a WQL query in an expression, and setting up the package configuration. If you want more details on the WMI Event Watcher and Send Mail tasks, you can consult my previous blog posts.

For creating the free space threshold variable, open the child package. Click inside the design background to select the package and, in the Packages tab, click on the Variable button.

alt text here…

This creates a variable and brings the Variables tool window to the front.

alt text here…

You can double click on each cell to change the variable’s name and type, and set its value.

alt text here…

For the WQL query, you must first add a WMI Event Watcher task to the package. You’ll also need to add a Send Mail task and a precedence constraint to complete the workflow.

alt text here…

Notice that the green line for the precedence constraint also has an S character. The S stands for success; Mist uses both colors and letters to identify the constraint’s evaluation value.

To add an expression, select the WMI Event Watcher task and click on the Expression button in the ribbon.

alt text here…

This opens the Expressions tool window, adding a new expression to the task.

alt text here…

Double clicking on the Property cell opens a combo box that lists all the properties for the task.

alt text here…

As in the BIDS sample, you should select the WqlQuerySource property. Next, you can double click in the Expression cell to access a text field where you can enter your expression.

alt text here…

Alternatively, you’re likely better off clicking on the ellipsis button to open Mist’s Expression Builder dialog.

alt text here…

In the Expression Builder, enter the same expression as in the BIDS sample. If you press the Evaluate button, you will see the same result as in BIDS, along with a confirmation that the expression is of type DT_WSTR.

alt text here…

If you enter an invalid expression, the evaluation will fail, but you can still press OK to store the expression. This is really nice if you want to save your progress and fix the expression later.

alt text here…

After pressing OK to dismiss the dialog, you can now fill in the remaining fields for the WMI Event Watcher and Send Mail tasks. The tasks’ details editors appear as follows:

alt text here…

alt text here…

For the parent package, you will also need to create the same workflow as in the BIDS sample. To start, find the Execute Package task in the Toolbox.

alt text here…

Now, drag and drop it onto the designer surface. You will also need to add a Foreach File Loop task and a File System task.

alt text here…

After selecting the Execute Package task, take a look at the Package Details tool window. The main properties for the Execute Package task match the properties displayed in BIDS.

alt text here…

However, one key difference is that many more input methods are available. In addition to a File connection or SqlServer connection, you can also select a table, package, or package emit target. Since both packages are in the same project for this sample, it makes sense to choose Package in the dropdown. In the Package dropdown beneath it, you can select the child package.

alt text here…

As a quick review, this is the Package Details tool window for the Foreach File Loop container. Here, we are iterating over all files in C:\Orders and its subdirectories. Each file path is stored in the ForeachFilePath variable during the iteration.

alt text here…

In the Package Details tool window for the File System task, the source for the Move File operation is the ForeachFilePath variable. The destination uses a File connection named ArchiveConnection.

alt text here…

Finally, for the Execute Package task, you need to add a variable of the same data type as FreeSpaceThreshold on the child package. You can repeat the earlier steps to add a variable to the package.

To set up the parameter passing using a variable, you can do something very different from, and far easier than, BIDS. Returning to the child package, open the Variables tool window and select its FreeSpaceThreshold variable. In the variable’s Parent Package Config textbox, enter User::FreeSpaceThreshold (or whatever name you assigned the parent package’s variable).

alt text here…

That’s it! No configuration wizard or dialog boxes.
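In Biml terms, that textbox corresponds to a single attribute on the child package’s variable. A minimal sketch, with the caveat that the attribute name InheritFromPackageParentConfigurationString is my best recollection of the syntax, so verify it against the Biml documentation:

<Variables>
    <!-- Pulls its value from the parent package's User::FreeSpaceThreshold variable -->
    <Variable Name="FreeSpaceThreshold" DataType="Int64" InheritFromPackageParentConfigurationString="User::FreeSpaceThreshold">0</Variable>
</Variables>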

Links:

Execute Package task – MSDN

Execute Package documentation for Biml

Craig

Thumbnail

Defining the Data Flow in Biml

Written by Varigence Blog on 6.15.2011

This post is part 5 of a series on using Biml in BIDS Helper. This post builds on some of the information and the sample from the previous posts.

In the previous post in the series, I talked about controlling the order of execution in the control flow. In this post, the focus will be on the dataflow, and controlling how the data in the pipeline flows from one component to the next. This post uses a new table as the target of the data flow, so you may want to review Part 2: Creating Tables using Biml and BimlScript to see how to create the table locally. The Biml below describes the table. You can create it in the database of your choice – I used a database named Target.

<Biml xmlns="http://schemas.varigence.com/biml.xsd">
    <Connections>
        <OleDbConnection Name="Source" ConnectionString="Provider=SQLNCLI10;Server=.;Initial Catalog=AdventureWorksDW2008R2;Integrated Security=SSPI;"/>
        <OleDbConnection Name="Target" ConnectionString="Provider=SQLNCLI10;Server=.;Initial Catalog=Target;Integrated Security=SSPI;"/>
    </Connections>
    <Tables>
        <Table Name="DimAccount_Test" ConnectionName="Target">
            <Columns>
                <Column Name="AccountKey" />
                <Column Name="ParentAccountKey" IsNullable="true" />
                <Column Name="AccountCodeAlternateKey" IsNullable="true" />
                <Column Name="ParentAccountCodeAlternateKey" IsNullable="true" />
                <Column Name="AccountDescription" DataType="String" Length="50" IsNullable="true" />
                <Column Name="AccountType" DataType="String" Length="50" IsNullable="true" />
                <Column Name="Operator" DataType="String" Length="50" IsNullable="true" />
                <Column Name="CustomMembers" DataType="String" Length="300" IsNullable="true" />
                <Column Name="ValueType" DataType="String" Length="50" IsNullable="true" />
                <Column Name="CustomMemberOptions" DataType="String" Length="200" IsNullable="true" />
            </Columns>
        </Table>
    </Tables>
</Biml>

With the table created, we can move on to the interesting part – transforming the data. In a simple, straightforward data flow, the Biml compiler will do most of the work for you. Take this data flow as an example:

<Dataflow Name="Dataflow 1">
    <Transformations>
        <OleDbSource Name="Source" ConnectionName="Source">
            <DirectInput>SELECT * FROM dbo.DimAccount</DirectInput>
        </OleDbSource>
        <OleDbDestination Name="Target" ConnectionName="Target">
            <ExternalTableOutput Table="dbo.DimAccount_Test"/>
        </OleDbDestination>
    </Transformations>
</Dataflow>

In this case, you don’t have to specify any data paths. The Biml compiler will infer that the OleDbSource’s output should be connected to the input of the OleDbDestination.

alt text here…

The compiler is able to do this by using default outputs. In Biml, most components have a default output defined. In the absence of other information, the compiler will automatically connect the default output of a transformation to the input of the next component defined in the Biml. So, if we use a slightly more complex data flow, like this:

<Dataflow Name="Dataflow 2">
    <Transformations>
        <OleDbSource Name="Source" ConnectionName="Source">
            <DirectInput>SELECT * FROM dbo.DimAccount</DirectInput>
        </OleDbSource>
        <Lookup Name="Check For Existing" OleDbConnectionName="Target" NoMatchBehavior="RedirectRowsToNoMatchOutput">
            <DirectInput>SELECT AccountKey FROM dbo.DimAccount</DirectInput>
            <Inputs>
                <Column SourceColumn="AccountKey" TargetColumn="AccountKey"/>
            </Inputs>
        </Lookup>
        <ConditionalSplit Name="Test ID Range">
            <OutputPaths>
                <OutputPath Name="High ID">
                    <Expression>AccountKey >= 100</Expression>
                </OutputPath>
            </OutputPaths>
        </ConditionalSplit>
        <OleDbDestination Name="Target" ConnectionName="Target">
            <ExternalTableOutput Table="dbo.DimAccount_Test"/>
        </OleDbDestination>
    </Transformations>
</Dataflow>

We end up with a data flow that still automatically connects data paths between components. In this case, though, it’s probably not doing exactly what we want, since it’s just connecting the default outputs. The Conditional Split (“Test ID Range”) in this example is connected by the default output, but we want to use the “High ID” output to filter out IDs less than 100. In the case of the Lookup (“Check For Existing”), the default output being used is the “Match” output, but we only want the non-matched records, so that only new rows are inserted.

alt text here…

I explicitly chose the option in BIDS to display the paths’ source names for this screenshot – by default, they aren’t displayed in the generated package. You can change the setting in BIDS by selecting the path, opening the Properties tool window, and changing the PathAnnotation property to SourceName.

So how would we change the Biml to get the desired results? If we add an InputPath element to the appropriate components, we can control which output is tied to the component’s input. In this case, we need to add explicit InputPath instructions to the Conditional Split (that will reference the Lookup’s NoMatch output) and to the OleDbDestination (which will reference the ConditionalSplit’s High ID output).

<Dataflow Name="Dataflow 3">
    <Transformations>
        <OleDbSource Name="Source" ConnectionName="Source">
            <DirectInput>SELECT * FROM dbo.DimAccount</DirectInput>
        </OleDbSource>
        <Lookup Name="Check For Existing" OleDbConnectionName="Target" NoMatchBehavior="RedirectRowsToNoMatchOutput">
            <DirectInput>SELECT AccountKey FROM dbo.DimAccount</DirectInput>
            <Inputs>
                <Column SourceColumn="AccountKey" TargetColumn="AccountKey"/>
            </Inputs>
        </Lookup>
        <ConditionalSplit Name="Test ID Range">
            <InputPath OutputPathName="Check For Existing.NoMatch"/>
            <OutputPaths>
                <OutputPath Name="High ID">
                    <Expression>AccountKey >= 100</Expression>
                </OutputPath>
            </OutputPaths>
        </ConditionalSplit>
        <OleDbDestination Name="Target" ConnectionName="Target">
            <InputPath OutputPathName="Test ID Range.High ID"/>
            <ExternalTableOutput Table="dbo.DimAccount_Test"/>
        </OleDbDestination>
    </Transformations>
</Dataflow>

This gives you the following data flow.

alt text here…

That’s a few examples of controlling the data paths in a data flow. There are a few other bits of information that are important to know about data paths in the data flow.

- Most components have a default output named “Output”, and a second output named “Error” for the error output (if the component supports errors).
- The Multicast component has no default output, so you always need to explicitly define the data path mapping from it to the next component (see the sketch below).
- The Union All, Merge, and Merge Join components need to be explicitly mapped, since they support multiple inputs.
- The Slowly Changing Dimension (SCD) transformation has multiple outputs. The “New” output is the default; there are also outputs named “Unchanged”, “FixedAttribute”, “ChangingAttribute”, “HistoricalAttribute”, and “InferredMember”.
- The Percentage Sampling and Row Sampling transformations have two outputs named “Selected” (the default) and “Unselected”.

The sample Biml for this post is on my SkyDrive. Please download it and try it out with the latest release of BIDS Helper.
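To illustrate the Multicast point above: because it has no default output, each downstream component must name the output it consumes with an explicit InputPath. Here’s a minimal sketch, assuming the Multicast declares its outputs with OutputPaths the same way the Conditional Split does in the examples earlier in this post:

<Multicast Name="Copy Rows">
    <OutputPaths>
        <OutputPath Name="To Target" />
        <OutputPath Name="To Audit" />
    </OutputPaths>
</Multicast>
<OleDbDestination Name="Target" ConnectionName="Target">
    <!-- Explicitly consume one of the Multicast's named outputs -->
    <InputPath OutputPathName="Copy Rows.To Target" />
    <ExternalTableOutput Table="dbo.DimAccount_Test" />
</OleDbDestination>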

Thumbnail

Controlling the Control Flow in Biml

Written by Varigence Blog on 6.13.2011

This post is part 4 of a series on using Biml in BIDS Helper. This post builds on some of the information and the sample from the previous posts.

So far, we’ve looked at some relatively simple packages, in terms of their flow. In this post, we’re going to look at how to handle more complex control flow in Biml.

One feature of Biml is the ConstraintMode property that’s part of packages and containers. This property controls how precedence constraints are generated in the control flow. In the simple case, if you want all tasks to be run in sequence, you can set the ConstraintMode to Linear. This causes the package to be produced with all tasks connected sequentially by Success precedence constraints, in the order they were specified in the Biml. So, the following Biml:

<Biml xmlns="http://schemas.varigence.com/biml.xsd">
    <Packages>
        <Package Name="Control Flow Sample 1" AutoCreateConfigurationsType="None" ConstraintMode="Linear">
            <Tasks>
                <Dataflow Name="Task 1"/>
                <Dataflow Name="Task 2"/>
                <Dataflow Name="Task 3"/>
            </Tasks>
        </Package>
    </Packages>
</Biml>

results in a package that looks like this:

alt text here…

This type of linear flow is great in some situations, but sometimes you need more control. In those cases, you can change the ConstraintMode to Parallel. The generated package will not have any automatically created precedence constraints, so it will look like this:

alt text here…

Once a container is in Parallel constraint mode, you can start adding explicit precedence constraints. Let’s use an example to highlight this. Imagine you have a package that needs to run three data flows. (I’m using data flows for the example because they are simple to read in Biml, and I want the focus to be on the constraints, not the tasks.) I want one data flow to execute first – this data flow will be named “Me First”. If “Me First” succeeds, the data flow named “Me Next (Success)” should execute. The third data flow, named “I’m Last (Always)” should always be executed, regardless of the success or failure of the other two tasks. So my package should look like this:

alt text here…

So how do we get this output from Biml? We can use the PrecedenceConstraints collection on each task. At its simplest, you just add an Input to the collection, and reference the output of the task that should execute prior to this one. In Biml, all tasks have a built-in output named Output. You can reference it using TaskName.Output (“Me First.Output” in the example below). This will create a regular Success constraint between the tasks.

<Dataflow Name="Me Next (Success)">
    <PrecedenceConstraints>
        <Inputs>
            <Input OutputPathName="Me First.Output"/>
        </Inputs>
    </PrecedenceConstraints>
</Dataflow>

For the next set of constraints, we want to use the OR logical type, using the LogicalType property, for the constraints, since either of them should cause the third task to run. We also need to explicitly set the evaluation value on these, using the EvaluationValue property.

<Dataflow Name="I'm Last (Always)">
    <PrecedenceConstraints LogicalType="Or">
        <Inputs>
            <Input OutputPathName="Me First.Output" EvaluationValue="Failure"/>
            <Input OutputPathName="Me Next (Success).Output" EvaluationValue="Completion"/>
        </Inputs>
    </PrecedenceConstraints>
</Dataflow>

You can also add expression constraints to the Inputs, to control whether tasks run based on the result of an expression. You use the EvaluationOperation and Expression properties to configure that.

<Package Name="Control Flow Sample 3" AutoCreateConfigurationsType="None" ConstraintMode="Parallel">
    <Variables>
        <Variable Name="Continue" DataType="Int32">0</Variable>
    </Variables>
    <Tasks>
        <Dataflow Name="Task 1"/>
        <Dataflow Name="Task 2">
            <PrecedenceConstraints>
                <Inputs>
                    <Input OutputPathName="Task 1.Output" EvaluationOperation="Expression" Expression="@Continue==1"/>
                </Inputs>
            </PrecedenceConstraints>
        </Dataflow>
    </Tasks>
</Package>

That Biml results in a package that looks like this.

alt text here…

That’s how to control the precedence constraints. I’ve uploaded the Biml from this post to my SkyDrive here, so you can download and experiment with this yourself. In the next post, we’ll look at controlling the data paths in a data flow.

John Welch [cross-posted from http://agilebi.com/jwelch/2011/06/13/controlling-the-control-flow-in-biml/]

Thumbnail

Copy Data Dynamically with BimlScript

Written by Varigence Blog on 5.31.2011

This post is part 3 of a series on using Biml in BIDS Helper. This post builds on some of the information and the sample from the previous posts.

BimlScript enables some interesting scenarios for generating large numbers of SSIS packages automatically. This can come in handy when you need to copy most or all of the data in one database to a different one. In this case, you could use something like the Transfer SQL Server Objects task, but it has a few problems. You can roll your own, but that might mean a fair amount of custom scripting. Or you could use the Import / Export Wizard. But in all these cases, you don’t have complete control of how the packages are produced. You could create all the packages by hand, which does give you full control, but then you are stuck doing a lot of repetitive work in SSIS.

BimlScript provides an alternative that lets you fully control the output, while automating the rote work of producing lots of packages that use the same pattern. Let’s take a look at a sample of this, using the scenario above (copying the data from one database to another).

<#@ template hostspecific="true"#>
<#@ import namespace="System.Data" #>
       
<Biml xmlns="http://schemas.varigence.com/biml.xsd">
      <Connections>
            <OleDbConnection ConnectionString="Provider=SQLNCLI10;Server=.;Initial Catalog=AdventureWorksDW2008R2;Integrated Security=SSPI;"/>
            <OleDbConnection ConnectionString="Provider=SQLNCLI10;Server=.;Initial Catalog=Target;Integrated Security=SSPI;"/>
      </Connections>
      <Packages>
            <# 
                string metadataConnectionString = "Provider=SQLNCLI10;Server=.;Initial Catalog=AdventureWorksDW2008R2;Integrated Security=SSPI;";
                DataTable tables = ExternalDataAccess.GetDataTable(metadataConnectionString, 
                    "SELECT '[' + s.name + '].[' + t.name + ']' FROM sys.tables t INNER JOIN sys.schemas s on t.schema_id = s.schema_id");
                foreach (DataRow row in tables.Rows)
                { #>
            <Package ConstraintMode="Linear" AutoCreateConfigurationsType="None">
                  <Tasks>
                        <Dataflow> 
                              <Transformations>
                                    <OleDbSource ConnectionName="Source">
                                          <DirectInput>SELECT * FROM <#=row[0]#></DirectInput>
                                    </OleDbSource>
                                    <OleDbDestination ConnectionName="Target">
                                          <ExternalTableOutput Table="<#=row[0]#>"/>
                                    </OleDbDestination>
                              </Transformations>
                        </Dataflow>
                  </Tasks>
            </Package>
                <# } #>
      </Packages>
</Biml>

This script is set up to copy all the data in the AdventureWorksDW2008R2 database to a second database named Target (very inventive, I know). One note – the script is not creating the tables in the target database. We could actually automate that portion as well, but it’s beyond the scope of this post. To ensure you are set up properly to run this script, you should create an exact structural copy of your source database under a different name. You can use the Generate Scripts Wizard to do this. Just script the entire database, and then update the generated script to use a different database name (don’t forget to change the USE statement to the new name).

The script will produce a package per table, with a simple data flow that copies all the data using an OLE DB Source and OLE DB Destination. The script leverages the metadata already contained in the database, in the sys.tables view, to drive the loop that creates the packages.

What if you don’t want to select all the rows from each table? Instead, perhaps you want to specify a WHERE clause to use to filter some of the tables. To handle this, we can create a table in the target database that holds our WHERE information.

<Biml xmlns="http://schemas.varigence.com/biml.xsd">
    <Connections>
        <OleDbConnection ConnectionString="Provider=SQLNCLI10;Server=.;Initial Catalog=Target;Integrated Security=SSPI;"/>
    </Connections>
    <Tables>
        <Table Name="WhereClause" ConnectionName="Target">
            <Columns>
                <Column Name="TableName" DataType="String" Length="255"/>
                <Column Name="WhereSql" DataType="String" Length="4000"/>
            </Columns>
        </Table>
    </Tables>
</Biml>

You can use the steps shown in Part 2 of this series to create this table in the Target database. Once it’s been created, populate it with some data. Note that since the packages use the schema-qualified name of each table, you’ll need to store schema-qualified names in the TableName column. There’s an example of data for this table that will work with AdventureWorksDW2008R2 below. This will filter the rows down to only sales where the amount is 1000 or more.

TableName                   WhereSql
[dbo].[FactInternetSales]   WHERE [SalesAmount] >= 1000
[dbo].[FactResellerSales]   WHERE [SalesAmount] >= 1000

Now we need to alter the script to use the new information in this table. At the beginning of the block of script after the <Packages> element, add the following code:

string targetConnectionString = "Provider=SQLNCLI10;Server=.;Initial Catalog=Target;Integrated Security=SSPI;";
DataTable whereClauses = ExternalDataAccess.GetDataTable(targetConnectionString, "SELECT TableName, WhereSql FROM WhereClause");

This retrieves the WHERE clauses from the WhereClause table, and stores them in the whereClauses variable.

Next, replace the DirectInput line in the OleDbSource with this:

<# 
  var dataRow = whereClauses.Select(string.Format("TableName = '{0}'", row[0]));
  string whereSql = dataRow.Length == 0 ? string.Empty : dataRow[0][1].ToString();    
  string sql = string.Format("SELECT * FROM {0} {1}", row[0], whereSql);
#>
<DirectInput><#=sql#></DirectInput>

This code determines whether the whereClauses table has a row for the current table. If it does, it appends it to the end of the SELECT statement. The complete, final script looks like this:

<#@ template hostspecific="true"#>
<#@ import namespace="System.Data" #>
       
<Biml xmlns="http://schemas.varigence.com/biml.xsd">
      <Connections>
            <OleDbConnection ConnectionString="Provider=SQLNCLI10;Server=.;Initial Catalog=AdventureWorksDW2008R2;Integrated Security=SSPI;"/>
            <OleDbConnection ConnectionString="Provider=SQLNCLI10;Server=.;Initial Catalog=Target;Integrated Security=SSPI;"/>
      </Connections>
      <Packages>
            <# 
                string targetConnectionString = "Provider=SQLNCLI10;Server=.;Initial Catalog=Target;Integrated Security=SSPI;";
                DataTable whereClauses = ExternalDataAccess.GetDataTable(targetConnectionString, "SELECT TableName, WhereSql FROM WhereClause");
                 
                string metadataConnectionString = "Provider=SQLNCLI10;Server=.;Initial Catalog=AdventureWorksDW2008R2;Integrated Security=SSPI;";
                DataTable tables = ExternalDataAccess.GetDataTable(metadataConnectionString, 
                    "SELECT '[' + s.name + '].[' + t.name + ']' FROM sys.tables t INNER JOIN sys.schemas s on t.schema_id = s.schema_id");
                foreach (DataRow row in tables.Rows)
                { #>
            <Package ConstraintMode="Linear" AutoCreateConfigurationsType="None">
                  <Tasks>
                        <Dataflow> 
                              <Transformations>
                                    <OleDbSource ConnectionName="Source">
                                        <# 
                                            var dataRow = whereClauses.Select(string.Format("TableName = '{0}'", row[0]));
                                            string whereSql = dataRow.Length == 0 ? string.Empty : dataRow[0][1].ToString();    
                                            string sql = string.Format("SELECT * FROM {0} {1}", row[0], whereSql);
                                        #>
                                          <DirectInput><#=sql#></DirectInput>
                                    </OleDbSource>
                                    <OleDbDestination ConnectionName="Target">
                                          <ExternalTableOutput Table="<#=row[0]#>"/>
                                    </OleDbDestination>
                              </Transformations>
                        </Dataflow>
                  </Tasks>
            </Package>
                <# } #>
      </Packages>
</Biml>

You can see the results of this script by right-clicking on the Biml file, and choosing Expand. It may take a minute or two to process, but when it finishes, you should see a package for each table in your source database. The data flows will copy the data from Source to Target, and any WHERE clauses you add to the WhereClause table will be used.

There’s a lot more that could be done with this script (automating the recreation of the tables in the destination, or deleting existing data, for example), but it’s still a good example of what BimlScript can do. Instead of spending your time writing 10s or 100s of repetitive packages, automate it with BimlScript.

[cross-posted from http://agilebi.com/jwelch/2011/05/31/copy-data-dynamically-with-bimlscript/]
Thumbnail

Creating Tables using Biml and BimlScript

Written by Varigence Blog on 5.26.2011

This post is part 2 of a series on using Biml in BIDS Helper. This post builds on some of the information and the sample from the previous posts.

When I’m creating samples for SSIS, I often find it necessary to create supporting tables to go along with the package sample. One of the things I like about Biml is that you can define both your tables and packages in the language. Here’s an example of defining an OrderHeader and OrderDetail table in Biml:

<Biml xmlns="http://schemas.varigence.com/biml.xsd">
    <Connections>
        <OleDbConnection Name="DbConnection" ConnectionString="Server=.;Initial Catalog=Sandbox;Provider=SQLNCLI10.1;Integrated Security=SSPI;"/>
    </Connections>
    <Tables>
        <Table Name="OrderHeader" ConnectionName="DbConnection">
            <Columns>
                <Column Name="OrderId" DataType="Int32" IdentityIncrement="1" IdentitySeed="1"/>
                <Column Name="SalesDate" DataType="DateTime"/>
                <Column Name="CustomerName" DataType="String" Length="50"/>
            </Columns>
            <Keys>
                <PrimaryKey Name="OrderHeaderPK">
                    <Columns>
                        <Column ColumnName="OrderId"/>
                    </Columns>
                </PrimaryKey>
            </Keys>
        </Table>
        <Table Name="OrderDetail" ConnectionName="DbConnection">
            <Columns>
                <Column Name="OrderDetailId" DataType="Int32" IdentityIncrement="1" IdentitySeed="1"/>
                <TableReference Name="OrderId" TableName="OrderHeader"/>
                <Column Name="ProductName" DataType="String" Length="50"/>
                <Column Name="Qty" DataType="Int16"/>
                <Column Name="UnitPrice" DataType="Currency"/>
            </Columns>
            <Keys>
                <PrimaryKey Name="OrderDetailPK">
                    <Columns>
                        <Column ColumnName="OrderDetailId"/>
                    </Columns>
                </PrimaryKey>
            </Keys>
        </Table>
    </Tables>
</Biml>

Tables are defined in a <Table> tag. They can have columns defined, as well as keys, and even indexes (not shown in the example above). Notice that the OrderId column doesn’t have a DataType attribute. Many of the attributes in Biml have default values, and data type is one of them. If it’s not specified, the column data type will default to Int32. The primary key for the table is defined with a <PrimaryKey> element.

The OrderDetail table includes a <TableReference> column. TableReference columns are a special class of column that defines a foreign key reference to another table. This one references back to the OrderHeader table. It’s not shown here, but you can also use a MultipleColumnTableReference if your foreign key needs to span multiple columns.

Great – now you have your tables defined in Biml, but how do you make use of that? If only there were some way to run this against your database to create the tables… Well, fortunately, there is – by using BimlScript. BimlScript is a scripting layer that automates the production of Biml (similar in concept to the way ASP.NET produces HTML). To set this up, you need to add two Biml files to your project – one to hold the table definitions above, and one to hold the BimlScript.

First, add a new Biml file to the SSIS project (see Part 1 if you need a refresher on this). Copy the Biml above to this file, and rename the file to TableDefinitions.biml.

alt text here…

Second, add an additional Biml file. Name this one CreateTables.biml.

alt text here…

Open the CreateTables.biml file, and replace the contents with the following code:

<#@ template language="C#" hostspecific="True" #>
 
<Biml xmlns="http://schemas.varigence.com/biml.xsd">
    <Packages>
        <Package Name="Create Tables" AutoCreateConfigurationsType="None" ConstraintMode="Linear">
            <Tasks>
                <# foreach(var table in RootNode.Tables) {#>
                <ExecuteSQL Name="Create <#=table.Name#>" ConnectionName="<#=table.Connection.Name#>">
                    <DirectInput>
                        <#=table.GetTableSql()#>    
                    </DirectInput>
                </ExecuteSQL>
                <# } #>
                <# foreach(var table in RootNode.Dimensions) {#>
                <ExecuteSQL Name="Create <#=table.Name#>" ConnectionName="<#=table.Connection.Name#>">
                    <DirectInput>
                        <#=table.GetTableSql()#>    
                    </DirectInput>
                </ExecuteSQL>
                <# } #>
                <# foreach(var table in RootNode.Facts) {#>
                <ExecuteSQL Name="Create <#=table.Name#>" ConnectionName="<#=table.Connection.Name#>">
                    <DirectInput>
                        <#=table.GetTableSql()#>    
                    </DirectInput>
                </ExecuteSQL>
                <# } #>
            </Tasks>
        </Package>
    </Packages>
</Biml>

This file has a header at the beginning that indicates the script will use C#. The next section defines a package named “Create Tables”. The section inside the Tasks element is the interesting part:

<# foreach(var table in RootNode.Tables) {#>
<ExecuteSQL Name="Create <#=table.Name#>" ConnectionName="<#=table.Connection.Name#>">
    <DirectInput>
        <#=table.GetTableSql()#>    
    </DirectInput>
</ExecuteSQL>
<# } #>

This code iterates over the tables that are part of the current model. For each table it finds, it creates an ExecuteSQL task, and embeds the SQL to create the table in the package. The code is repeated to iterate over Dimensions and Facts, which are special classes of tables.

Notice that there are no tables defined in the BimlScript file. The BimlScript can’t operate against objects defined in the same file, which is why we created the TableDefinitions.biml file separately. To produce the package, multi-select both TableDefinitions.biml, and CreateTables.biml, right-click, and choose Expand Biml File.

alt text here…

This will produce a new SSIS package in the project named Create Tables.dtsx. It contains two Execute SQL tasks, one for each table.

alt text here…

Each task includes the appropriate SQL to create the tables. As an example, here’s the OrderHeader SQL from the Execute SQL task.

        SET ANSI_NULLS ON
        SET QUOTED_IDENTIFIER ON
        GO
         
        -------------------------------------------------------------------
        IF EXISTS (SELECT * from sys.objects WHERE object_id = OBJECT_ID(N'[OrderHeader]') AND type IN (N'U'))
        DROP TABLE [OrderHeader]
        GO
         
        CREATE TABLE [OrderHeader]
        (
        -- Columns Definition
         [OrderId] int IDENTITY(1,1) NOT NULL
        , [SalesDate] datetime NOT NULL
        , [CustomerName] nvarchar(50) NOT NULL
         
        -- Constraints
        ,CONSTRAINT [OrderHeaderPK] PRIMARY KEY CLUSTERED
        (
          [OrderId] Asc) WITH(PAD_INDEX = OFF,IGNORE_DUP_KEY = OFF) ON [PRIMARY]
         
        )
        ON [PRIMARY]
        WITH (DATA_COMPRESSION = NONE)
        GO
 
        -------------------------------------------------------------------

Note that the tables are ordered in the package in the same order they are defined in the Biml file. If you have tables with dependencies, make sure to order them correctly.

In the next post, we’ll look at some ways to copy data dynamically using BimlScript.

[cross-posted from http://agilebi.com/jwelch/2011/05/26/creating-tables-using-biml-and-bimlscript/]
Thumbnail

How To – Tasks and Transformations: Foreach File Loop

Written by Varigence Blog on 5.19.2011

I left off last time discussing the SSIS File System task. Although that task does enable convenient bulk file operations, it’s limited due to its lack of wildcard support. Say I need a workflow that routinely copies all of the previous hour’s work orders to another location. By itself, the File System task is only helpful if the work orders are already grouped, which is unlikely. To help handle these more complicated operations, SSIS has a Foreach Loop Container task, which I will explore in this post.

Background:

The Foreach Loop Container task allows you to iterate over a list of items and run additional tasks on each item that’s enumerated. This post is going to focus solely on file enumeration, which is one of several enumeration options offered by the Foreach Loop container. In fact, the Foreach Loop container can enumerate all of the following types:

| Enumerator | Purpose |
|---|---|
| ADO | Enumerate rows in tables |
| ADO.NET Schema Rowset | List schema information about a data source, such as the list of tables in a SQL Server database |
| File | Enumerate files in a folder |
| From Variable | Enumerate the enumerable object that a variable contains. For instance, if a variable contains a DataTable or an array, you can enumerate its values. |
| Item | List items in a collection, such as the names of executables in an Execute Process task |
| NodeList | List the items in an XPath expression result set |
| SMO | Enumerate SMO objects, such as the views in a SQL database |

In Biml, each enumerator type is associated with its own task. Thus, instead of using the Foreach Loop Container task, you would use the Foreach File Loop task.

With the file enumerator, the Foreach Loop Container task enumerates each file in a directory, allowing you to run other tasks on those files. It also allows you to filter the files you want to enumerate using wildcards, whereas the File System task is limited to either all files in a directory or a particular file. That's the difference that enables more flexible file operations.

The scenario I’m going to demonstrate is archiving all .csv files in a directory.

Using BIDS:

To get started, you need to find the Foreach Loop Container in the Toolbox.

alt text here…

Drag and drop the task on the package’s design surface. Notice that this task has some differences from the previous tasks I’ve demonstrated. First, there’s no red X icon; in fact, the error list remains empty after adding the task. Second, this is the first task with a beige area underneath its name. The beige area is where you can add other tasks that reside in the container. When the task enumerates over the list of files, it will execute the workflow within its container for each file.

alt text here…

Double click on the task to open its editor dialog.

alt text here…

Click on the Collection list item on the left side of the dialog to display the Collection settings.

alt text here…

By default, the Foreach Loop task uses a file enumerator, which is what you want. Note that if you click on the Foreach File Enumerator value, you’ll see a down arrow button.

alt text here…

Clicking on the arrow button opens a dropdown that lets you choose a different enumerator.

alt text here…

Beneath the Foreach Loop Editor property grid is the Enumerator configuration group. This is where you specify what files to enumerate.

alt text here…

For this scenario, you first need to specify the directory that contains the .csv files to be archived. To do that, enter the folder path in the Folder text box. Note that you can also use the Browse button to display a folder selector dialog to help retrieve the folder path.

alt text here…

This path indicates that the task will enumerate files within the ForEachContainerFiles directory.

With the folder path identified, you now need to specify the files to enumerate. The default value indicates that all files in the folder path will be listed. However, the goal here is to archive only .csv files, skipping files with other extensions. To specify just .csv files, type in *.csv. The star character indicates that zero or more characters are acceptable. Be sure not to add a star on the right side as well; a value like *.csv* could unintentionally match files such as bob.csvc.

Note that the Files textbox isn't limited to specifying file extensions. Say you had files with dates embedded in their file names, like sales03-05-11.csv. You could use the Files text box to archive just the files from a particular month or year.

Beneath the Files text box is the Retrieve file name group box. This group box lets you specify the format in which each file is returned. There are three formats to choose from:

| Format | Example |
|---|---|
| Fully qualified | C:\varigence\Samples\T&T Blog\ForEachContainerFiles\results1.csv |
| Name and extension | results1.csv |
| Name only | results1 |

Since you will be performing file operations, having the fully qualified path will be useful. Thus, click on the Fully qualified radio button to select that format.

Finally, the Traverse subfolders checkbox lets you decide if the Foreach Loop Container will search subfolders, beneath the folder path, for files that meet your criteria.

alt text here…

With the enumerator setup done, you now need to store the current file path as the enumerator proceeds. This can be done using a variable. To set this up, first click on the Variable Mappings list item on the left side of the dialog.

alt text here…

As no variables have been created yet, you’ll need to add one. Do that by clicking inside the Variable cell.

alt text here…

A dropdown button will appear. Click on the dropdown button to open a list that contains a New Variable item.

alt text here…

Click on the New Variable item to open the Add Variable dialog box.

alt text here…

The first change you need to make is the container. This whole Package doesn’t need to know about this variable; just the ForEach Loop container. To change the variable’s container, click on the Container’s down arrow button to open a pop-up.

alt text here…

Expand the tree view nodes in the pop-up to reach the Foreach Loop container that you created. Click on it to select it and then press OK.

Next, I recommend changing the variable’s Name to something clearer, such as CsvFilePath. The other values can remain unchanged.

alt text here…

Press OK to create the variable and dismiss the dialog.

alt text here…

Notice that the CsvFilePath variable has been mapped to a 0 index. While it turns out that this is correct, it’s worth understanding what this means. For the moment, let’s say that instead of enumerating a list of files, you need to iterate over a row set, where each row has some columns.

|       | Name      | Street                     | Zip Code |
|-------|-----------|----------------------------|----------|
| Row 1 | Microsoft | 1 Microsoft Way            | 98052    |
| Row 2 | Apple     | 1 Infinite Loop            | 95014    |
| Row 3 | Intel     | 2200 Mission College Blvd. | 95054    |

In this case, if you wanted to operate on the Street value for a row, as the enumeration occurs, you would create a variable and map it to index 1. You would map to index 1, instead of index 2, since despite Street being the second column, indexes are zero based. For the file enumerator, despite thinking about it as a list of files, it’s internally treated like a table where each row represents an item in the collection and the table has one column for the file path. Thus, BIDS maps the variable to index 0.

Now you can press OK to dismiss the Foreach Loop editor dialog.

The next step is to place the File System task within the Foreach Loop container so it can run as the list of files is enumerated. To do that, find the File System task in the Toolbox window.

alt text here…

Then drag and drop the File System task on top of the container.

alt text here…

You can review my previous blog post on the File System task for step by step instructions on how to set it up. In brief, start by opening the File System Task editor dialog.

First, change the operation type from Copy file to Move file.

alt text here…

Second, set the Source Connection group’s IsSourcePathVariable to true. That will change the SourceConnection property to SourceVariable. In the SourceVariable dropdown, select User::CsvFilePath.

Third, create a Destination Connection, with its usage type set to Existing Folder. The File Connection’s Folder path should be the directory where the files will be archived.

alt text here…

Your File System task editor dialog should now look like:

alt text here…

You can now try to run your package. But if you do, you’re going to get a build error as the File System task has a red X icon. Checking the error list, you’ll see:

alt text here…

This is because no value was entered for the variable. This is counterintuitive, since the whole point of this approach is that the enumerator will populate the variable as it runs. However, the validator isn't that smart, so you'll need to work around this by giving the variable a placeholder value that will be overwritten once the workflow runs. To do that, first click on the Foreach Loop Container task to select it.

alt text here…

Then, open the SSIS menu in the BIDS menu bar and select Variables.

alt text here…

This opens the Variables tool window for the Foreach Loop container. Next, double click on the value cell for the CsvFilePath variable.

alt text here…

In the cell, type in anything, such as Placeholder. Then press enter.

alt text here…

This will resolve the error, although the error list may not immediately reflect it. I had to close my package and reopen it to get the error list to update.

Using Mist:

Now it’s time to discuss implementing a Foreach File loop in Mist.

To start, you’ll need to create a package. If you’re unfamiliar with how to create a package, you can follow the first couple steps in the Mist User Guide’s Creating a Package topic.

Once you’ve opened the package, navigate to the Toolbox tool window.

alt text here…

Notice that there’s a Container group in the tree view. The Foreach File Loop container is within the ForEach sub group. Select and drag the Foreach File Loop container onto the Package’s design surface.

alt text here…

Unlike BIDS, there is a red X icon on the Foreach File Loop container since it has errors.

alt text here…

Mist requires you to provide values for the Folder and FileSpecification properties.

Notice that to the right of the red X icon, there’s a magnifying glass icon. I’ll explain that later in this post.

Click on the Package Details tab to see the tool window’s contents.

alt text here…

Matching the BIDS sample, the first step is to enter the Folder and File Specification for the Foreach File Loop container. You can enter the Folder path in the Folder textbox. You can enter the wildcard, for the files you want enumerated, in the File Specification text box.

alt text here…

The Retrieve File Name format defaults to Fully Qualified so that doesn’t need to change. You can check the Process Subfolders checkbox to enable that behavior.

alt text here…

Next up, you need to create a variable mapping. To do that, go into the ribbon and find the Variable Mapping button.

alt text here…

Notice that the button is in a group named Task Specific. As you select different tasks in a package, the buttons in the Task Specific group will change to be appropriate for the selected task. Press the button to create a variable mapping.

alt text here…

The mapping is added in the Foreach File Loop container’s Variable Parameter Mappings data grid. If you double click on the Parameter Name cell in the mapping’s row, you can change the mapping’s name. If you double click on the variable cell, a combo box will appear, indicating that no variable has been selected.

alt text here…

Clicking on the combo box will bring up a listing of the task’s built-in variables.

alt text here…

Of course, you will want to create a new variable. To do that, return to the ribbon and press the Variable button.

alt text here…

The Variables tool window will be brought to the front of the Mist UI, with a new variable created and already selected. You can change the variable’s name by double clicking on the name cell and then typing in the name you want. To match the BIDS sample, I’ve named the variable CsvFilePath.

alt text here…

Next, return to the package details tool window and in the variable mapping, reopen the Variable combo box. You can now select the CsvFilePath variable.

alt text here…

With that, the Foreach File Loop task is set up.

alt text here…

Notice that in the Variable cell, the variable’s scope is included in the name, meaning that you can easily see the variable is in Foreach File Loop 1, which resides in Package1. Also notice that on the errors tab to the left, there are 0 errors, confirming that every required property has been filled in.
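
For reference, the Biml that Mist maintains for this container looks roughly like the following sketch (element and attribute names follow the Foreach File Loop documentation linked below, but treat the details as approximate):

<ForEachFileLoop Name="Foreach File Loop 1" ConstraintMode="Linear"
                 Folder="C:\ForEachContainerFiles" FileSpecification="*.csv"
                 RetrievedFileNameFormat="FullyQualified" ProcessSubfolders="true">
    <Variables>
        <Variable Name="CsvFilePath" DataType="String"></Variable>
    </Variables>
    <VariableMappings>
        <VariableMapping Name="0" VariableName="User.CsvFilePath" />
    </VariableMappings>
    <Tasks>
        <!-- The File System task added below goes here -->
    </Tasks>
</ForEachFileLoop>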

With that done, you can add the File System task within the Foreach File Loop container. Find the File System task in the Toolbox tool window.

alt text here…

Then drag and drop it on top of the Foreach File Loop container.

alt text here…

Now I’m ready to talk about the magnifying glass icon in the Foreach File Loop’s header. If you click on the magnifying glass, you’ll see the following:

alt text here…

The magnifying glass dedicates the entire design surface to the Foreach File Loop task. Admittedly, you wouldn't need this feature for this simple scenario. However, it comes in very handy if you're implementing a complicated workflow within a container. Instead of having to repeatedly resize the container, or scroll within it, you can easily take advantage of your screen real estate. Returning to the package view is also easy; just click on Package1 in the breadcrumb bar.

alt text here…

To finish the sample, you’ll need to populate the File System task with appropriate values. You can refer to my previous blog post for details on using that task. When done, the File System task’s details editor should look as follows:

alt text here…

If you want to play with this sample, you can download it from here.

More Information:

It’s important to note that if you have a Foreach File Loop where you modify the collection during execution, it can have unintended side effects. Microsoft explicitly warns users that unexpected situations may occur due to files being added, deleted, or renamed during execution. If you’re going to implement a workflow where you process a file and then archive it, you might consider using two Foreach File Loop containers. The first processes each file. The second performs the move operations. That way, there’s no risk that files may not be processed as the enumeration changes due to the file moves.
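
In Biml terms, that two-container pattern might be sketched like this (reusing the ForEachFileLoop element from the sketch above; details approximate):

<Package Name="ProcessThenArchive" ConstraintMode="Linear">
    <Tasks>
        <ForEachFileLoop Name="Process Files" Folder="C:\ForEachContainerFiles" FileSpecification="*.csv">
            <Tasks>
                <!-- tasks that process each file -->
            </Tasks>
        </ForEachFileLoop>
        <ForEachFileLoop Name="Archive Files" Folder="C:\ForEachContainerFiles" FileSpecification="*.csv">
            <Tasks>
                <!-- the File System task that moves each file -->
            </Tasks>
        </ForEachFileLoop>
    </Tasks>
</Package>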

Links:

Foreach File Loop Element - BIML

Foreach Loop Container - MSDN

Foreach Loop Container – Collection Page - MSDN

Craig

Thumbnail

Creating a Basic Package Using Biml

Written by Varigence Blog on 5.13.2011

This article is going to walk through the process of creating a simple package using Biml and the Biml Package Generator feature in BIDS Helper. To start out, you need to install the latest beta of BIDS Helper from CodePlex. Once that is set up, you should create a new Integration Services project in BIDS. In the project, right-click on the Project in the Solution Explorer. There’s a new item in this menu – Add New Biml File.

alt text here…

Clicking Add New Biml File will add a new file to the Miscellaneous folder in the solution named BimlScript.biml. (The name is automatically generated, so it may be BimlScript1.biml, etc). You can right-click on the file and choose rename to give the file a more specific name. For this example, rename the file “BasicPackage.biml”.

Double-clicking on the file will open the XML editor inside of BIDS. The editor supports Intellisense for Biml, so typing an opening tag (“<”) will give you a list of valid options for tags you can use. (If you aren’t seeing the Intellisense, please check this link for troubleshooting steps.)

alt text here…

For this example, copy and paste the following Biml into the document. Since the code below includes the document root tags (<Biml> and </Biml>), you'll want to make sure you replace the entire contents of the Biml file.

<Biml xmlns="http://schemas.varigence.com/biml.xsd">
    <Connections>
        <Connection Name="AdventureWorks" ConnectionString="Server=.;Initial Catalog=AdventureWorks;Integrated Security=SSPI;Provider=SQLNCLI10"/>
    </Connections>
    <Packages>
        <Package Name="Biml Sample" AutoCreateConfigurationsType="None" ConstraintMode="Linear">
            <Tasks>
                <Dataflow Name="Extract Table List">
                    <Transformations>
                        <OleDbSource Name="Get Table List" ConnectionName="AdventureWorks">
                            <DirectInput>SELECT * FROM sys.tables</DirectInput>
                        </OleDbSource>
                        <Multicast Name="Multicast"/>
                    </Transformations>
                </Dataflow>
            </Tasks>
        </Package>
    </Packages>
</Biml>

The first section (<Connections>) of this Biml defines an OleDbConnection that points to the AdventureWorks database. The next section (inside the <Packages> tag) defines a single package that contains a Dataflow task (the <Dataflow> tag). The Dataflow task contains two components, an OleDb Source and a Multicast transformation.

The next step is to take this definition of a package, and actually generate the package from it. To do this, right-click on the Biml file, and choose Expand Biml File from the context menu.

alt text here…

A new package will be added to the SSIS Packages folder, named Biml Sample.dtsx. If you review the generated package, you’ll see that it matches up to what was defined in the Biml code.

alt text here…alt text here…

That’s a quick introduction to the Biml functionality in BIDS Helper. In the next article, we’ll set the stage for some more advanced (read: more interesting) uses of Biml, including some scripting.

[cross posted from http://agilebi.com/jwelch/2011/05/13/creating-a-basic-package-using-biml/]

Thumbnail

How To – Tasks and Transformations: File System

Written by Varigence Blog on 5.11.2011

For the past several weeks, I’ve been reviewing SSIS Workflow tasks. Today, I want to take a detour and consider how you can handle file system operations, such as copying files. While you could use the Execute Process task to run a batch file that performs file system operations, there’s also a SSIS task for that.

Background:

The SSIS File System task is a straightforward task that lets you perform several common file system operations directly in your workflow.

The types of supported operations are as follows:

| Operation | Description | Requires Source | Requires Destination | Special Properties |
|---|---|---|---|---|
| Copy directory | Copies a directory from one location to another | X | X | |
| Copy file | Copies a file from one location to another | X | X | |
| Create directory | Creates a folder in the specified location | X | | X |
| Delete directory | Deletes a folder in a specified location | X | | |
| Delete directory content | Deletes a folder and all of its contents | X | | |
| Delete file | Deletes a file at a specified location | X | | |
| Move directory | Moves a directory from one location to another | X | X | |
| Move file | Moves a file from one location to another | X | X | |
| Rename file | Renames a file at a specified location | X | X | |
| Set attributes | Sets attributes on files and folders | X | | X |

In this post’s examples, I will walk through the properties for all the operation types, including the special properties that are unique to the Create directory and Set attributes operations.

Using BIDS:

To get started with the File System task, first find it in the Toolbox tool window.

alt text here…

Next, drag and drop the task onto the package’s design surface.

alt text here…

The red X icon in the task indicates that there are errors. Consulting the error list, you’ll see the following:

alt text here…

From the errors, it’s clear that the task’s default operation is to copy a file. Naturally, the task expects valid source and destination paths to perform the copy.

Double clicking on the File System task brings up its properties dialog.

alt text here…

If you click inside the Operation field, you’ll see a down arrow appear to the right of Copy File. Clicking on the arrow will list all the operation types you can choose from.

alt text here…

One thing in common with all of these operations is that they all require a source connection. The source connection is the file or folder that’s being changed. In this case, it’s the file being copied. For a delete file operation, it’d be the file being deleted.

The dialog’s Source Connection group lists two properties: IsSourcePathVariable and SourceConnection.

alt text here…

The IsSourcePathVariable is a boolean value that tells the task whether the source file path can be found in a variable or file connection. If you use the property’s dropdown to set the property to true, you’ll see that the SourceConnection property is renamed to the SourceVariable property in the dialog.

For this example, I am going to stick with a connection. If you select the SourceConnection property, you’ll see a down arrow button. Click on it to open a dropdown that lists any existing file connections along with the option to make a new connection.

alt text here…

Since no file connections currently exist, click on <New connection…> to create one in the File Connection Manager editor.

alt text here…

This editor allows you to reference an existing file or folder. Since this task is performing a copy file operation, using Existing file makes sense. In the File text field, you can enter the absolute path to the file you want to copy. You can also press the Browse button to display a Select File dialog.

alt text here…

You can navigate to the file and press Open. The path to the file is then added in the dialog for you.

alt text here…

Once you’ve selected your path, you can press OK to dismiss the dialog.

When using the copy file operation, you also need to specify where the file will be copied to. The properties in the dialog’s Destination Connection group let you set that up.

alt text here…

Similar to the Source Connection group, the Destination Connection group has an IsDestinationPathVariable property and a DestinationConnection property. The IsDestinationPathVariable property lets you specify a variable that holds the destination path. The third property, OverwriteDestination, is another boolean property. Setting it to true causes the task to forcibly place the item you're copying at the specified path, even if another file or folder is already there.

For the destination, try setting IsDestinationPathVariable to true.

alt text here…

You’ll see that the DestinationConnection property becomes DestinationVariable. If you select the DestinationVariable property, you’ll see a down arrow button. Click on it to open a dropdown that lists any existing variables along with the option to make a new one.

alt text here…

You can click on New variable… to create a new variable. When you do this, the Add Variable dialog will appear.

alt text here…

You can keep most of the defaults. However, I do recommend changing the name to something more helpful. Additionally, you must enter a directory path, where the file will be copied to, in the Value textbox.

alt text here…

Notice that there's unfortunately no Browse button to help you find the destination folder. Once you're done, press OK to dismiss the dialog.

With that done, you’re ready to actually copy a file.

alt text here…

If you try to run the task in BIDS, you'll see that the file is indeed copied from C:\DailyData\data.csv to C:\DataStorage\data.csv.

As mentioned earlier, all of the File System task's operations require a source, and half of them need a destination. The two operations with some variation are Create directory and Set Attributes.

Create directory has a unique property - UseDirectoryIfExists. This property handles the scenario where you attempt to create a directory at a path where a directory already exists. If the property is set to true, the task simply uses the existing directory, as if it had just been created.

alt text here…

Set Attributes lets you set four attributes on files or folders in the file system.

alt text here…

File attributes are metadata values that are used by the operating system. The attributes you can set are:

| Property | Description |
|---|---|
| Hidden | The file or directory cannot be seen in a directory listing. |
| ReadOnly | Applications can read the file, but not write to it or delete it. This attribute is ignored for directories. |
| Archive | Typically indicates that a file or folder is used for backup or removal. |
| System | A file or directory used by the operating system. |

Using Mist:

Now, let’s see how to do the above in Mist.

To start, you’ll need to create a package. If you’re unfamiliar with how to create a package, you can follow the first couple steps in the Mist User Guide’s Creating a Package topic.

Once your package is open, navigate to the Toolbox tool window.

alt text here…

Select and then drag the File System task onto the Package’s design surface.

alt text here…

The task has a red X icon, just like in BIDS, since it has errors.

Inside the Package Details tool window, located below the package designer, you’ll be able to edit the File System task’s properties.

alt text here…

For the Copy File operation, there are Source and Destination groups, similar to the BIDS dialog. The Source method default is a File connection. However, if you click on the File Connection dropdown, you’ll see that it only contains a value for (No Selection). Thus, you’ll need to create a connection. You can follow the steps in the Mist User Guide’s Create New Connection section to create a connection, although you’ll want to make a File connection instead of an Ole Db connection.

The designer for the File connection has one red bordered field, clearly indicating that you need to provide a file path.

alt text here…

Clicking on the ellipses button, to the right of the red border, displays an Open File dialog to help you find your file.

Matching the BIDS sample, provide an appropriate file path for your file connection. Also, feel free to rename this connection so it’s easy to identify, and to make it easy to use elsewhere.

alt text here…
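
Behind the scenes, the Biml for this connection is a single element, roughly like this (a sketch; the path is the example path from the BIDS section):

<Connections>
    <FileConnection Name="DailyDataFileConnection" FilePath="C:\DailyData\data.csv" />
</Connections>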

Now, reopen the package and select the File System task to view its details again. In the Source group’s File Connection combo box, you can now select DailyDataFileConnection.

alt text here…

In the Destination group, to match the BIDS sample, you’ll need to assign a variable that contains the destination path. To do that, go to the Method dropdown and select Variable.

alt text here…

Once you select Variable, the Destination group UI changes to provide a Variable dropdown.

alt text here…

If you open the dropdown, you’ll see all the System variables that already exist. To create a new one, click in the designer’s white background area to re-select the Package. Then, in the ribbon, click on the Variable button.

alt text here…

This adds a Variable to the package. When you press the Variable button, the Variables tool window comes to the front and the new variable is selected.

alt text here…

To match the BIDS sample, double click in the variable’s Value cell and enter the folder path. You can also double click in the variable’s Name cell to change its name.

alt text here…

Next, reopen the package and select the File System task to view its details. In the Destination group, you can now select your variable from the Variable dropdown. Also, you can check the Overwrite Destination check box to enable that capability.

alt text here…

If you switch the Operation to Set Attributes, via the Operation combo box, you’ll see an Attributes group with four checkboxes.

alt text here…

Similar to the BIDS dialog, you can check or uncheck each of these to control which attributes to set.

Next Time:

One important limitation for this task is that it only operates on a single file or directory. Thus, if you want to perform file operations using a path with wildcards, such as C:\Logs\2009*.csv, this task can't help you. Next time, I'll demonstrate how to overcome this deficiency.

Links:

File System documentation for the BIML language

File System task - MSDN

File Attributes - MSDN

Craig

Thumbnail

How To - Tasks and Transformations: Execute Process

Written by Varigence Blog on 5.5.2011

For the past three weeks, I’ve been demonstrating how to construct and use SSIS workflow related tasks. Continuing with that theme, I’m going to walk through the Execute Process task.

Background:

The Execute Process task provides flexibility in your package workflow, by allowing you to run a Windows application or batch script. There are tons of scenarios where this may be useful. Common ones include compressing or expanding a file before using it further in your workflow; for instance, you may have a file generated by the WMI Data Reader task that you want to compress before sending it to someone via the SendMail task. Alternatively, if you need custom information that can’t be obtained via an existing SSIS task, you could use Execute Process to run an application that generates the data you need, and then use that data in your workflow.

For this post’s example, I’m going to implement an Execute Process task in BIDS that executes a sample application. While this is very similar to executing an application that will zip or unzip a file, it gives me an opportunity to point out differences.

Using BIDS:

As always, the first step is to find the task in the Toolbox.

alt text here…

Then, drag and drop the task onto the package’s design surface.

alt text here…

The red X icon in the task indicates that the task has errors.

alt text here…

Double clicking on the Execute Process task brings up its task editor dialog. Clicking on the Process menu item in the left ListBox will let you see the Execute Process task's specific properties.

alt text here…

The two key properties in this task are Executable and Arguments; both are located near the top of the dialog. Both properties relate to running a command in a command prompt. For instance, if you want to extract a file from a zip file, you could:

1. Open a command prompt.
2. Use the cd command to change the directory to C:\Program Files\winzip.
3. Run wzunzip.exe /e C:\docs\fileToUnzip.zip C:\unzippedFilesDir, where C:\docs\fileToUnzip.zip is the zip file and C:\unzippedFilesDir is where the extracted file should be placed.

To emulate this with the Execute Process task, you would set the Executable property to C:\Program Files\winzip\wzunzip.exe and set the Arguments property to /e C:\docs\fileToUnzip.zip C:\unzippedFilesDir. Notice that the Executable property takes an absolute path to the executable.
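
As an aside, the Biml equivalent of this WinZip example is a single ExecuteProcess element (a sketch using the same example paths; the Biml editor is covered in the Mist section below):

<ExecuteProcess Name="Unzip File" Executable="C:\Program Files\winzip\wzunzip.exe" Arguments="/e C:\docs\fileToUnzip.zip C:\unzippedFilesDir" />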

For my sample, I have a small Visual Basic application that I am going to run with the Execute Process task. Of course, you can run any Windows application using this task.

To set the Executable property, click inside the Executable property’s text field. One option here is to simply enter the full path to your executable.

alt text here…

If you prefer a graphical approach, or need to search for the executable, click on the ellipses button to display a dialog box that will let you select an executable.

alt text here…

Related to the Executable property are the RequireFullFileName and WorkingDirectory properties. RequireFullFileName lets you fail the task if the executable isn’t found at its specified location. I am keeping its default value of true since I want to be notified if the executable is missing. The WorkingDirectory property lets you specify a folder where the executable can be found. I prefer to leave WorkingDirectory blank and provide a fully qualified path to the executable in the Executable property.

The next step is to enter your arguments.

alt text here…

The Arguments property uses a plain text field. My sample application takes two arguments, which I'll enter here. The application reads files that contain customer order information, filters out some data, and then stores the remaining data in another directory. Thus, both of my arguments are file paths.

alt text here…

Although this works great for static arguments that won’t change, you may be wondering how to create arguments that can change each time the task is run. For example, let’s say you had an application that created log files and it took a date as an argument. You could create an argument, in the Execute Process task, that always passes in the current date as a string.

To do that, start by clicking on the Expressions item in the task dialog’s left ListBox.

alt text here…

Next, click on the Expander symbol to the left of the Expressions item.

alt text here…

That replaces the Expander box with an Expressions property and a field with an ellipses button. Click on the ellipses button to display the Property Expressions editor dialog box.

alt text here…

Using this dialog, you can assign SSIS expressions to various properties on this task. To set an expression on the Arguments property, click inside the selected Property cell.

alt text here…

The cell becomes a ComboBox. If you click on the down arrow, a dropdown list appears; select the Arguments property.

alt text here…

Next, click on the ellipses button at the end of the row to open the Expression Builder dialog box.

alt text here…

Here, you can author a SSIS expression whose value will be evaluated at runtime. Since I want to provide the date to an application, as a string, I would enter the following expression:

alt text here…
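
For passing the current date to an application as a string, the expression would be something along the lines of (DT_WSTR, 30)GETDATE(); this is an illustrative sketch, and the exact length you cast to is up to you.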

Notice that instead of entering a SSIS expression, I can also assign a Variable to an argument. Finally, if I need multiple arguments, one option would be to use an expression that uses multiple variables, separated by spaces.

Note that the Expression Builder dialog has some significant deficiencies. When entering an expression, you don’t get Intelliprompt support, and there is no syntax highlighting for your expression. Later on in this post, I’ll demonstrate how Mist has improved this experience.

Returning to the Execute Process dialog, next up are the StandardInputVariable, StandardOutputVariable, and StandardErrorVariable properties. These all enable you to interact with an application's input and output streams.

For example, my sample application prints the name of each file that’s processed. To store that output in a file on the command line, I would run SampleApp.exe > outputFile.txt. Using the Execute Process task, that output can be stored in a variable.

To do that, you need to assign a variable to the StandardOutputVariable property. Begin by clicking inside the field for the StandardOutputVariable property.

alt text here…

Next, click on the arrow button to open a dropdown.

alt text here…

Select the New variable… menu item to open the Add Variable dialog box.

alt text here…

This dialog allows you to set various properties on your new variable:

| Name | Description |
|---|---|
| Container | Indicates the scope of the variable. By selecting Package1, this variable becomes available to all containers, tasks, and event handlers within the package. |
| Name | Specifies a name for the variable |
| Namespace | SSIS provides two namespaces, System and User. The System namespace contains system variables and the User namespace contains custom variables. However, you are also allowed to create additional namespaces. |
| Value type | Specifies the type of literal that the variable stores. Note that the variable's value must be of the value type. For example, having a variable value of abc with a value type of DateTime won't work. |
| Value | The value that the variable stores. |

In order to make the purpose of this variable clearer, rename it to ProcessOutput. You can leave the remaining properties as-is.

alt text here…

Press OK to return to the task editor dialog box.

alt text here…

If the process you’re running prints to the error stream, you can repeat the above using the StandardErrorVariable property to capture error output in another variable.

As you may expect, the StandardInputVariable property lets you interact with a process’s standard input stream. On the command line, this is the equivalent of SampleApp.exe < inputFile.txt. For the task, instead of having input be stored in a file, the input would be placed in a variable’s Value field, and that variable would be assigned to the StandardInputVariable property.

Along with properties that control the executing process, the Execute Process task also lets you control what happens once the executable is running and how the executable’s behavior affects your workflow.

alt text here…

The SuccessValue property lets you specify the value that’s returned by your executable to indicate success. The default value is 0 and that’s the typical success value for most applications.

By setting the FailTaskIfReturnCodeIsNotSuccessValue property to true, the Execute Process task will compare the process’s return code to the value in SuccessValue. If they don’t match, the Execute Process task will fail.

For my scenario, I definitely want to be aware if my process fails so I will leave this property assigned to true.

The TimeOut property indicates how long your process can run, in seconds. The default value is 0, meaning the process can run indefinitely and won’t be stopped. If you set a value for TimeOut and the process is still running at that point, the process is allowed to finish but your task then fails.

If set to true, the related TerminateProcessAfterTimeOut property will cause the running process to be terminated if it runs past the task’s TimeOut value. You may have noticed that this property is initially disabled; it becomes enabled if you set a positive value for the TimeOut property.

If my process runs unexpectedly long, I want to be notified immediately. Thus, I am going to set the TimeOut property to 300, giving my process 5 minutes to run. To do that, I can simply click inside the TimeOut property’s text field and type in my value.

alt text here…

The final property in the dialog is the WindowStyle property, which controls how the process’s main window will appear when it’s executed.

| Value | Description |
|---|---|
| Normal | The process's window will appear normally. |
| Maximized | The process's window will be maximized. |
| Minimized | The process's window will be minimized. |
| Hidden | You won't be able to see the process's window. |

Since I'm storing the process's output in a variable, I am going to choose to hide my process's window.

You can select a different window style by clicking inside the WindowStyle property field. A down arrow button will appear; click on it to open a dropdown.

alt text here…

You can then select a WindowStyle within the dropdown.

And with that, the Execute Process task is now ready to run.

alt text here…

Using Mist:

Now let’s see how to implement the Execute Process task in Mist.

To start, you’ll need to create a package. If you’re unfamiliar with how to create a package, you can follow the first couple steps in the Mist User Guide’s Creating a Package topic.

Once your package is open, navigate to the Toolbox tool window.

alt text here…

Select and drag the Execute Process task onto the Package’s design surface.

alt text here…

This task has a red X icon, just like in BIDS, since it has errors.

alt text here…

In past posts, at this point, I would point out the Package Details tool window and start entering values for the appropriate task properties. However, today I want to demonstrate the Biml editor.

Biml stands for Business Intelligence Markup Language, and is an XML language for describing BI assets. As you create assets in the Mist UI (packages, tasks, variables, expressions, cubes, tables, etc.), Mist generates Biml for each asset, which is saved when you save your project.

In Mist, the Biml editor is, by default, layered behind the designer.

alt text here…

If you click on the Package1.biml tab, it brings the Biml editor to the front.

alt text here…

Note that if you click and drag the Biml editor tab to the right, you can have the designer and Biml editor side-by-side.

alt text here…

In the Biml editor, start by clicking inside the ExecuteProcess element, right before its close tag (/>). Press the spacebar to see an Intelliprompt pop-up of all properties for the ExecuteProcess task.

alt text here…

Double click on the Executable property to insert it in the ExecuteProcess element.

alt text here…

Within the Executable attribute’s quotes, you can enter the path to your executable. Note that if you need to use the open dialog to find your executable, that’s available in the Package Details tool window via the ellipses button to the right of the Executable property’s text field.

alt text here…

Next, you can hit the spacebar, to the right of the end quote of the Executable path, to again display the properties list for the ExecuteProcess task.

alt text here…

If you single click on the Arguments property, notice that a pop-up appears that provides a brief description of the property’s purpose. Press the enter key to insert the Arguments property in the ExecuteProcess element.

alt text here…

You can type in your process’s arguments between the quotes.

alt text here…

To create a variable for the StandardOutputVariable property, click on the designer tab to return to the package's designer.

alt text here…

With the ExecuteProcess task selected, click on the Variable button in the Mist ribbon.

alt text here…

This will add a variable to the ExecuteProcess task and will bring the Variable tool window to the front of the Mist UI.

alt text here…

The newly created variable is already selected. Matching the BIDS sample, rename this variable to ProcessOutput by double clicking on the variable’s Name cell and entering the new variable name.

alt text here…

Now, click on the Biml editor tab and notice that a Variables collection has been added in the Biml.

alt text here…

To tie this variable to the task's standard output, display the Intelliprompt pop-up again and double click on StandardOutputVariableName. Within the attribute's quotes, enter the variable's name.

alt text here…

You may have noticed that Mist filters already-set properties out of the Intelliprompt pop-up, to remove unnecessary items.

Matching the BIDS sample, you can keep using Intelliprompt to insert the Timeout property and set its value to 300, and then insert the WindowStyle property and set it to Hidden.

alt text here…
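
Putting those edits together, the ExecuteProcess element should now look something like this (a sketch; the executable path and arguments are placeholders for your own values):

<ExecuteProcess Name="Execute Process 1" Executable="C:\SampleApp\SampleApp.exe"
                Arguments="C:\SampleApp\Input C:\SampleApp\Output"
                StandardOutputVariableName="User.ProcessOutput"
                Timeout="300" WindowStyle="Hidden">
    <Variables>
        <Variable Name="ProcessOutput" DataType="String"></Variable>
    </Variables>
</ExecuteProcess>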

At this point, you could right click on Package1 in the Logical View tool window and run Build & Open in BIDS to build your Mist project and then open it in BIDS to run it.

alt text here…

Now, if you’re wondering about using an expression for the Arguments property, you can certainly do that in Mist as well.

To do that, first remove the Arguments property in the Biml editor.

alt text here…

Then, return to the package designer and in the Mist ribbon, click on the Expression button.

alt text here…

This creates an expression in the ExecuteProcess task and brings the Expressions tool window to the front of the window.

alt text here…

Double click on the expression’s Property cell to display a ComboBox. Click on the ComboBox to display its pop-up.

alt text here…

Select the Arguments property. Next, double click inside the property’s Expression cell.

alt text here…

Press the ellipses button to display Mist’s Expression Builder dialog.

alt text here…

One thing you'll notice is the resize grip down the middle of the dialog, and the additional resize grip between the Expression and Result areas. This dialog is designed to give you much more flexibility, so you can focus on the areas you intend to work in, and minimize the portions you don't need.

If you click inside the Expression text area and start typing, you’ll notice that we immediately provide an Intelliprompt pop-up, along with a tooltip that describes what you’re entering. Also notice that expression text has color highlighting.

alt text here…

As I type in my expression, notice that I continue to see Intelliprompt.

alt text here…

If I press enter to select the DT_WSTR cast, it’s inserted in the Expressions text box with placeholders for the other values that are expected; in this case, the string’s length.

alt text here…

Once you’ve finished entering your expression, you can press the Evaluate button to obtain the expression’s value.

alt text here…

Notice that we display the expected and actual type of the expression’s value.

Once your expression is entered, press OK to dismiss the dialog. In the Biml editor, you’ll see that the expression is added, although formatted poorly.

alt text here…

To fix the formatting, go into the Biml Editor tab in the Mist ribbon, and press the Format Document button.

alt text here…

The Format Document button alters the document’s indentation to make it easier to read.

alt text here…
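
After formatting, the expression shows up as an Expressions collection on the task, roughly like this (a sketch; the expression text is the illustrative date cast from earlier):

<ExecuteProcess Name="Execute Process 1" Executable="C:\SampleApp\SampleApp.exe">
    <Expressions>
        <Expression ExternalProperty="Arguments">(DT_WSTR, 30)GETDATE()</Expression>
    </Expressions>
</ExecuteProcess>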

Extras:

One topic I didn’t cover here is getting the exit code from the application being run. You can check out this Stack Overflow post for details on how to do that.

Links:

Execute Process documentation for the Biml language

Execute Process task – MSDN

Craig

Thumbnail

Drillthrough in Vivid

Written by Varigence Blog on 5.4.2011

Vivid provides the ability to do two things with drillthrough that are not supported by SQL Server Analysis Services (SSAS) or Excel: (1) drillthrough on a calculated measure and (2) drillthrough in the presence of multiselect filters. In this entry you'll learn how Vivid successfully handles both of these.

Drillthrough on Calculated Measures

Drillthrough on calculated measures is difficult to get right for a variety of reasons (http://www.biblogs.com/2008/09/01/drillthrough-on-calculated-measures/). For example, assume the following calculation:

MEMBER [Measures].[Calculation1] AS  
   [Measures].[Internet Order Count] / [Measures].[Customer Count]

Doing a drillthrough on a calculated measure needs to return all of the rows that correspond to all of the measures that constitute the calculation. In this case we have the [Internet Order Count] and [Customer Count] measures. Vivid does drillthrough on a calculation with a series of steps we outline here:

  1. Parse and decompose the calculation, finding all of the measures in the calculation. Note that if there are calculations within the calculation then Vivid will recursively decompose the calculation.
    1. In this example there are two measures [Internet Order Count] and [Customer Count], with no calculations to recursively decompose.
  2. Find the measure groups associated with each measure.
    1. [Customer Count] comes from the measure group Internet Customers and [Internet Order Count] from Internet Orders.
  3. Perform a drillthrough on each distinct measure group, requesting as part of the drillthrough result set all of the measures that were used in the calculation (from that measure group).
    1. In the above example, Vivid will perform two drillthroughs, as there are two distinct measure groups.

Thus doing a drillthrough on a calculation can result in multiple drillthroughs performed, and therefore multiple sheets created (each drillthrough gets its own sheet). A single drillthrough will only be performed if all the measures in the calculation are from the same measure group.

While this works great much of the time, it does have some limitations.

  1. Vivid parses the MDX expression, but for complex expressions it may be the case that a given measure doesn’t contribute any value for a given tuple. Vivid will still return a sheet to correspond to that measure/measure group.
  2. Vivid always performs the drillthrough on the current coordinate. If the calculation is returning data from a different coordinate, e.g., using ParallelPeriod, then Vivid's drillthrough will return a potentially different set of data than expected. So be mindful when your calculations are internally manipulating coordinates. One useful feature in Vivid is that you can hover over a calculation in the pivot table editor to see its expression.

Drillthrough with MultiSelect Filters

Multiselect filters in Excel also block the standard SSAS drillthrough command. Vivid can handle them, but it is useful to understand how Vivid does them, in order to get the best performance.

https://varigencecomstaging.blob.core.windows.net/blogimages/MultiSelectFilter.png

First, it is important to note that the inability to drillthrough on multiselect filters is an SSAS limitation, which affects Excel as a client. Given that, the way Vivid performs a drillthrough with multiselect filters is by avoiding actually performing a drillthrough with multiselect filters. This is best demonstrated by explaining the steps Vivid goes through for multiselect filters.

  1. Find all of the multiselect filters and store the selected values. We will use them later.
  2. Perform a drillthrough on the value, but first remove all of the multiselects from the query. Thus the drillthrough is on a query that has only single select filters (or no filters at all).
  3. Filter the result from step 2 using the stored selected values from step 1.

Note that step 2 can result in a drillthrough result set that is much larger than the final result set. In theory, you could get back from SSAS a million rows (or more), while the final result set is empty (after filtering on the client). For this reason, try to use as many single selects as possible, as each of them constrains the drillthrough on the server. Another ramification is that, by default, Vivid caps drillthrough with MAXROWS 10000, so you may see a dialog asking whether to rerun the drillthrough without the MAXROWS cap, even though the final result set you receive might be smaller than 10,000 rows.

It's worth noting that Vivid treats slicers as multiselect filters.

Lastly, the presence of multiselect filters does require that you specify the drillthrough columns for the associated measure group. The reason for this requirement is that we need to return a custom set of drillthrough data in order to properly filter (from step 3). Rather than underspecifying the columns that the user wants returned, we simply require the user to specify exactly what they'd like returned. Custom drillthrough columns is an extremely useful feature on its own, so the forced discovery is really serendipity.

https://varigencecomstaging.blob.core.windows.net/blogimages/DrillthroughColumnEditor.png

Kang Su Gatlin

Thumbnail

Replacing template columns with EditingElementStyle

Written by Varigence Blog on 4.29.2011

For a long time, I assumed that whenever I needed a DataGridCell to display a custom control, my only solution was to use a DataGridTemplateColumn. And admittedly, a DataGridTemplateColumn is a reasonable way to display custom cells. However, I recently discovered an alternative approach for creating custom cells, using a DataGridColumn's EditingElementStyle property, that helped me out and definitely warrants discussion.

My scenario was that the Expression DataGrid in Mist needed a custom text cell for an Expression's expression property. Along with a text field, I wanted to provide a button to display the expression builder dialog box. I wrote a custom control for the cell, which is basically a Grid with two columns; the left column is the TextBox and the right column contains the button.

alt text here…

My initial approach was to use a DataGridTemplateColumn to display the custom cell:

alt text here…

However, it turns out that this approach has weaknesses. For example, let's say I'm making an edit in an External Property cell. I then press the tab key to switch to the Expression cell so I can continue my edits. But when I press another key, to begin entering my expression, nothing happens. I end up needing to press a second key before my input registers. Another annoyance occurs if I tab to the Expression cell and press the spacebar. No edits occur on the first press, as before, but subsequent spacebar presses are also ignored. While I'm sure these focus issues could be fixed with some work, they're still annoying. And spending hours debugging WPF focus oddities isn't my favorite activity.

The alternative is to go back to using the humble DataGridTextColumn and customize it using its EditingElementStyle property:

alt text here…
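
Here's a minimal XAML sketch of the approach (the binding path, column layout, and button content are placeholders, not Mist's actual implementation; the retemplated TextBox must include a ScrollViewer named PART_ContentHost to host its text):

<DataGridTextColumn Binding="{Binding Expression}">
    <DataGridTextColumn.EditingElementStyle>
        <Style TargetType="TextBox">
            <Setter Property="Template">
                <Setter.Value>
                    <ControlTemplate TargetType="TextBox">
                        <Grid>
                            <Grid.ColumnDefinitions>
                                <ColumnDefinition Width="*" />
                                <ColumnDefinition Width="Auto" />
                            </Grid.ColumnDefinitions>
                            <!-- PART_ContentHost is where the TextBox renders its text -->
                            <ScrollViewer x:Name="PART_ContentHost" Grid.Column="0" />
                            <!-- Button that launches the expression builder dialog -->
                            <Button Grid.Column="1" Content="..." />
                        </Grid>
                    </ControlTemplate>
                </Setter.Value>
            </Setter>
        </Style>
    </DataGridTextColumn.EditingElementStyle>
</DataGridTextColumn>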

Visually, this looks the same as using the template column solution. However, this approach keeps tab navigation and keyboard focus working as desired. It's worth noting that ComboBox and CheckBox columns can also use this technique.

Craig

Thumbnail

How To – Tasks and Transformations: WMI Event Watcher

Written by Varigence Blog on 4.21.2011

In my previous post, I gave an example of using the WMI Data Reader task to query for information about a computer’s hard drive, and write that information to a file on disk.

In this post, I’m going to demonstrate the related WMI Event Watcher task.

Background:

Whereas the WMI Data Reader task returns data about the system, the WMI Event Watcher task detects changes in your system. Like the WMI Data Reader task, the WMI Event Watcher task also uses WQL to describe the event(s) it’s listening for. You might use the task to determine:

- When a file is added to a folder
- When a server's hard disk free space drops below a certain percentage
- When a server's free memory drops below a certain percentage
- When an application is installed

Although discussing all the intricacies of WQL queries is beyond the scope of this post, I do want to show a couple examples.

To detect when a file is added to a folder, say C:\WMIFileWatcher, the query would be:

SELECT * FROM __InstanceCreationEvent WITHIN 10 WHERE TargetInstance ISA "CIM_DirectoryContainsFile" AND TargetInstance.GroupComponent = "Win32_Directory.Name=\"c:\\\\WMIFileWatcher\""

To understand this query, I’m going to start with the WHERE clause. The WHERE clause is checking for a TargetInstance whose type is CIM_DirectoryContainsFile and whose GroupComponent is the path I’m listening for changes in.

CIM_DirectoryContainsFile is a WMI class that associates an instance of CIM_Directory and CIM_DataFile. Not surprisingly, CIM_Directory represents a file system directory and CIM_DataFile represents a logical file. Thus, there's an instance of CIM_DirectoryContainsFile for every file in every directory in your file system. It also means that when a file is added to C:\WMIFileWatcher, a new CIM_DirectoryContainsFile instance is created.

If you now look at the SELECT and FROM clauses, you'll see that I'm selecting all __InstanceCreationEvents. __InstanceCreationEvent is a WMI class that represents the creation of a new WMI class instance. So, I'm selecting all WMI instance creation events, looking for the creation of a CIM_DirectoryContainsFile instance for the C:\WMIFileWatcher directory.

Finally, the WITHIN clause indicates that the task will notify me of all instance creation events that meet my WHERE clause criteria, checking every 10 seconds.

Note that there are two other event classes. One is the __InstanceDeletionEvent class, which represents the deletion of a WMI instance:

SELECT * FROM __InstanceDeletionEvent WITHIN 5 WHERE TargetInstance ISA "Win32_Process"

This query detects every time a Win32_Process WMI instance is deleted. Effectively, this means it detects every time a Windows process is terminated.

There’s also the __InstanceModificationEvent class. A key difference with this class is that you also have access to a PreviousInstance value, which enables you to compare your PreviousInstance with your TargetInstance.

For this example, I’m going to have the WMI Event Watcher notify me when my machine’s free memory falls below 100 MB. When that happens, I will use a WMI Data Reader task to write out a file that lists all processes running and the physical memory they’re using, to see which processes are hogging system resources.

Using BIDS:

To author the WMI Event Watcher task, start by finding it in the Toolbox.

alt text here…

Drag and drop the task onto the package’s design surface. You’ll see a red X icon within the task, indicating that there are errors.

alt text here…

In the error list, the sole error is:

alt text here…

To set up the task, and resolve this error, double click on the WMI Event Watcher task. That will bring up the WMI Event Watcher Task Editor dialog.

alt text here…

To start, you can supply the task with a WQL query. The query’s content can come from different sources; the WqlQuerySourceType property lets you select the query’s source. Its dropdown allows you to choose from:

  • Direct Input – Enter text directly into a property
  • File Connection – Specifies a file that contains the content. The file may already exist or may be created at runtime. You might choose to store your queries in a file that you can change outside of BIDS.
  • Variable – Create or choose a variable that contains the content.

Use Direct Input and then click inside the WqlQuerySource field.

alt text here…

An ellipsis button will appear; click on it to open the WQL Query dialog.

alt text here…

This is a simple dialog box that’s useful for entering multiline queries. As mentioned earlier, I want to detect when my system’s free memory falls below 100 MB. To do that, the query is:

alt text here…

The key portion of this query is the second part of the WHERE clause. I’m comparing the FreePhysicalMemory field on my Win32_OperatingSystem instance to 102,400 KB. Within 5 seconds of each modification to the FreePhysicalMemory value, this Event Watcher will test whether the value has fallen below 102,400 KB.
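The query itself appears only in the screenshot; reconstructed from that description, it should read roughly as follows:

SELECT * FROM __InstanceModificationEvent WITHIN 5 WHERE TargetInstance ISA "Win32_OperatingSystem" AND TargetInstance.FreePhysicalMemory < 102400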

You may be wondering why I don’t suggest a more robust comparison, such as checking if less than 20% of the system’s memory is available:

SELECT * FROM __InstanceModificationEvent WITHIN 5 WHERE TargetInstance ISA "Win32_OperatingSystem" AND (TargetInstance.FreePhysicalMemory / TargetInstance.TotalVisibleMemorySize) < 0.2

There are two problems with this approach. First, FreePhysicalMemory and TotalVisibleMemorySize are both uint64 types, so dividing them produces an integer value, which is inappropriate for a percentage comparison. The second problem is that WHERE clauses in WQL queries are limited. The two allowed formats are:

SELECT * FROM class WHERE property operator constant

SELECT * FROM class WHERE constant operator property

You’ll notice that you must compare a property to a constant; you can’t compare an expression to a constant.

alt text here…

The next step is to create a WMI connection, so the task knows where to run the query. To do that, you can use the WmiConnection property, which lets you set the target system to run the WQL query on, as well as authentication for the connection. If you click inside the WmiConnection field, you’ll see a down arrow button. Click on the button to open a dropdown.

alt text here…

Inside the dropdown, click on ‘New WMI connection…’ to open the Connection Manager Editor dialog.

alt text here…

This dialog creates a new connection manager for the WMI Event Watcher task; the important fields are Server name and Namespace.

Server name identifies the machine to run this query on. Currently, it’s set to localhost. Since I’ll be running this task on my local machine, I’ve checked the Use Windows Authentication checkbox. If I don’t check it and try to run the task in BIDS, I’ll get an error that ‘User credentials cannot be used for local connections.’ Pressing the dialog’s Test button would also indicate the problem.

Namespace is a key field for WMI. The brief explanation is that different namespaces provide different types of functionality, known as classes. The root\cimv2 namespace (cimv2 is an abbreviation for Common Information Model v2) allows you to access most physical and logical components in a system. You can browse additional namespaces and their classes using tools such as WMI CIM Studio and Scriptomatic 2.0.

Once you’ve set up your server name and namespace, press OK in the dialog.

alt text here…

With the WMI connection and WQL query entered, it’s time to consider what happens in your workflow when an event fires.

alt text here…

The ActionAtEvent property offers you two choices, both of which include logging the event notification. Your actual choice is whether an SSIS action is initiated as well. As of this post in the series, I haven’t discussed SSIS events and you won’t need SSIS events for this example, so I suggest you select the Log the event option.

Notice that there are also options for what the WMI Event Watcher does after an event it’s listening for fires. The AfterEvent property lets you choose from three options:

  • Return with success – Return success when the event is detected
  • Return with failure – Return failure when the event is detected
  • Watch for the event again – Once the event is detected, continue watching for the event to occur again

The success and failure values matter for your workflow; the next task in your workflow may be different depending on which value your Event Watcher task returns. In this case, you can keep the ‘Return with success’ default.

alt text here…

While I’m discussing events, let’s look at the NumberOfEvents property. This property specifies the number of times the task watches for the event before completing. You can keep the default at 1.

Along with limiting the number of times the task listens for an event, you can also control how long the task waits for events using the Timeout property. The Timeout property specifies the maximum number of seconds the task will run; 0 indicates the task will not timeout. If the number of processed events doesn’t reach the NumberOfEvents value before the Timeout’s seconds have elapsed, the task completes due to the timeout.

Notice that there are properties specifically for actions to be taken at and after a timeout, analogous to the properties for events. These are the ActionAtTimeout and AfterTimeout properties. For ActionAtTimeout, you can change its value to simply log the timeout. The AfterTimeout value really depends on the scenario. In this example, the task never times out; the task will keep waiting until the system’s free memory falls below 100 MB. Thus, the ActionAtTimeout and AfterTimeout values don’t matter here. However, if you were trying to test a machine over a particular period of time, say 3 hours, then you would want to set the Timeout property and use ActionAtTimeout to control your workflow, based on whether the event fired or the timeout was reached.

alt text here…

With that done, you can press OK to save your changes and dismiss the dialog. Now you’re ready to add a WMI Data Reader task to complete the example. You can consult my previous post on how to construct that task.

Two key differences from the previous post are:

  1. The query to use for this example is: SELECT Name, ProcessId, WorkingSetSize, PeakWorkingSetSize FROM Win32_Process
  2. You might want to change the OverwriteDestination property to ‘Append to destination’

With both tasks present in the designer, you now need to connect them to build the workflow. To do that, first select the WMI Event Watcher task.

alt text here…

Then drag the arrow from the WMI Event Watcher Task to the top of the WMI Data Reader task.

alt text here…

When you release your mouse, the arrow will be extended to the WMI Data Reader task.

alt text here…

Double clicking on the arrow opens the Precedence Constraint Editor dialog box.

alt text here…

In this case, the dialog simply confirms that if the Event Watcher catches an event indicating that available memory has fallen below 100 MB, it will return success so that the Data Reader task runs.

With that, your workflow is now complete and this package could be used to detect a low memory situation and report all processes running on your system at that time.

Using Mist:

Now let’s see how to do the above in Mist.

To start, you’ll need to create a package. If you’re unfamiliar with how to create a package, you can follow the first couple steps in the Mist User Guide’s Creating a Package topic.

Once your package is created and opened, navigate to the Toolbox tool window.

alt text here…

Select and drag the WMI Event Watcher task onto the Package’s design surface.

alt text here…

The task has a red X icon because it has errors.

alt text here…

You’ll need to provide values for the WmiConnection and Source properties.

Inside the Package Details tool window, you’ll find the same properties as the WMI Event Watcher dialog in BIDS.

alt text here…

Our first step is to enter the WQL query. In the Source area, there is the Method dropdown, which lets you control the WqlQuerySourceType property. It’s already set to the default value of Direct. If you were to click on the dropdown, you’d see that Mist offers the same options that BIDS provides.

alt text here…

In the text box beneath the dropdown, you can enter the WQL query.

alt text here…

If you click on the WMI Connection dropdown, it only contains a value for (No Selection). Thus, you’ll need to add a WMI Connection to the project. You can follow the steps in the Mist User Guide’s Create New Connection section to create a connection, although you’ll want to make a WMI connection rather than an Ole DB connection.

The designer for the WMI connection has two red bordered fields, clearly indicating what you need to enter.

alt text here…

From the BIDS sample, provide an appropriate namespace and server name, and set Use Windows Authentication to true.

alt text here…

Now, reopen the package and select the WMI Event Watcher task to view its details again. In the WMI Connection combo box, you can now select WmiConnection1.

alt text here…

To match the BIDS sample, you now need to change the ActionAtEvent property to Log Event and you need to change the ActionAtTimeout property to Log Timeout.

alt text here…

To finish the workflow, you should add the WMI Data Reader task; again, you can consult my previous post to see how to construct that task.

Once both tasks are in the package, you’ll need to connect them to create your workflow. Start by clicking on the WMI Event Watcher task’s bottom anchor. Click and drag a line from the bottom anchor to the WMI Data Reader’s top anchor.

alt text here…

Then let go of your mouse.

alt text here…

As you can see, the straight line you dragged is replaced with an arrow. If you click on the arrow, the package details area updates with the Precedence Constraints editor. In this case, the entry that’s already been created specifies that a precedence constraint is coming from the WMI Event Watcher task and that its evaluation value is success.

alt text here…

You can leave this as-is.

With that finished, you can right click on your package and select the Build & Open in BIDS menu item.

alt text here…

This menu item builds the Mist package, and then opens the produced package in BIDS. You can then execute your package in BIDS. Mist is able to do this since the assets it produces are real SSIS assets; Mist-built assets never contain proprietary elements.

Links:

WMI Event Watcher documentation for the BIML language

WMI Event Watcher task – MSDN

WMI Win32 classes

WMI Tasks for Scripts and Applications - This site links to pages that offer script samples of different WMI tasks. Although not exactly the same as WQL, the scripts use WQL and are very readable, so they can be a good source of guidance for learning how to write various queries.

Craig

Thumbnail

Smart Filtering for Pivot Tables

Written by Varigence Blog on 4.19.2011

Not all measures in an SSAS cube or PowerPivot cut all the hierarchies and vice-versa. This is best explained by example. Let’s use the Adventure Works sample cube and Microsoft Excel 2010.

Start with a pivot table that only has Geography in the rows. In the Adventure Works cube the “Internet” measures don’t cut Geography, so if we were to add a measure from the Internet Customers measure group, e.g., Customer Count, we’d see a peculiar result:

https://varigencecomstaging.blob.core.windows.net/blogimages/SmartFilterFull.png

Indeed, we see 18,484 for all of the values for all of the countries. Is that accurate? No, that is the value for the total of all countries, but there is no breakdown by countries in the cube. By default this duplication of values is what you’ll see when you have a measure that doesn’t cut a hierarchy in a pivot table.

Looking at any given hierarchy/measure it’s not too difficult to determine what measures/hierarchies cut it. But as you add additional fields into the pivot table, determining which fields cut those that are in the pivot table becomes increasingly difficult.

Vivid’s SmartFilter fixes this issue with a simple solution. With SmartFilter enabled, the pivot table editor displays only those measures which cut all of the hierarchies in the pivot table, and only those hierarchies that cut all of the measures in the pivot table.

https://varigencecomstaging.blob.core.windows.net/blogimages/SmartFilter.png

Here's an example of two pivot table editors side by side. The one on the left has SmartFilter disabled while the one on the right has it enabled. Notice that the pivot table on the right has many measure groups and dimensions hidden, as they don't cut all of the measures/hierarchies in the current pivot table. This greatly reduces the complexity and clutter associated with large cubes.

SmartFilter Computation

To determine which measures remain visible, Vivid takes all of the hierarchies that are in the rows or columns of the pivot table (page filters and slicers are not used when computing the SmartFilter function) and, for each hierarchy, places the measure groups that cut it into a set. Thus, if there are n hierarchies in the rows/columns, there are n sets of measure groups. Vivid applies set intersection to the n sets, which results in a single set of measure groups whose measures will be visible in the pivot table editor.
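As a minimal sketch of that intersection logic (hypothetical names; not Vivid’s actual implementation):

using System.Collections.Generic;
using System.Linq;

public static class SmartFilterExample
{
    // cutsByHierarchy: for each hierarchy on the rows/columns, the set of
    // measure group names that cut it. The visible measure groups are the
    // intersection of all n sets.
    public static ISet<string> VisibleMeasureGroups(IList<ISet<string>> cutsByHierarchy)
    {
        // With no hierarchies on rows/columns there is nothing to intersect;
        // return an empty set here and let the caller decide what to show.
        if (cutsByHierarchy.Count == 0)
        {
            return new HashSet<string>();
        }

        var result = new HashSet<string>(cutsByHierarchy[0]);
        foreach (var cuts in cutsByHierarchy.Skip(1))
        {
            result.IntersectWith(cuts);
        }
        return result;
    }
}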

The same rules apply to displaying the hierarchies in the pivot table field list.

Thumbnail

How To – Tasks and Transformations: WMI Data Reader

Written by Varigence Blog on 4.14.2011

After discussing the Send Mail task last time, I’m going to continue the theme of workflow tasks and review WMI Data Reader.

Background:

WMI stands for Windows Management Instrumentation, and is an infrastructure that lets you manage data and operations on Windows systems. More simply put, WMI provides an interface for interacting with the Windows OS and hardware components. This interface can be used from C/C++, VBA, .NET, PowerShell, and SSIS.

The WMI Data Reader task enables you to use WMI in your package workflow. A common scenario for the WMI Data Reader task is to query a machine for information.

Common examples of the type of information you might retrieve with WMI include:

  • Data from the system’s event logs
  • The status and properties of hardware components
  • The applications, and their versions, installed on the system

Realize that WMI can go much deeper than the above examples. For instance, you can learn the capacity, type (e.g., SRAM, SDRAM, DDR, DDR-2), and operating status of physical memory in a machine. Or you can retrieve performance data about physical disks, such as how much data the disk is reading and writing per second.

Getting at this data requires using the WMI Query Language (WQL). Microsoft describes WQL as a dialect of SQL and it does look similar to T-SQL. As you’ll see in our example, a basic WQL statement can have a familiar pattern of:

SELECT <properties> FROM <class> WHERE <condition>

With that in mind, let’s check out how to add a WMI Data Reader in BIDS and Mist.

Using BIDS:

Start by finding the WMI Data Reader task in the Toolbox.

alt text here…

When you drag and drop the task onto the package’s design surface, you’ll see the red X icon, indicating that there are errors.

alt text here…

Opening the error list, you’ll see the sole error is:

alt text here…

To solve that problem, double click on the WMI Data Reader task in the designer, to bring up its properties dialog.

alt text here…

Since BIDS is unhappy with the lack of a query, you can start by adding one. The query’s content can come from different sources; the WqlQuerySourceType property lets you select the query’s source. Its dropdown allows you to choose from:

  • Direct Input – Enter text directly into a property
  • File Connection – Specifies a file that contains the content. The file may already exist or may be created at runtime. You might choose to store your queries in a file that you can change outside of BIDS.
  • Variable – Create or choose a variable that contains the content.

Use Direct Input and enter a query in the WqlQuerySource field. The example query below returns the free space, capacity, device ID, and description of my computer’s D drive. Naturally, you’ll want to change the drive letter to reflect the machine you’re running this query on.

alt text here…
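The query appears only in the screenshot; reconstructed from the description, it would look something like this (with Win32_LogicalDisk’s Size property standing in for capacity):

SELECT FreeSpace, Size, DeviceID, Description FROM Win32_LogicalDisk WHERE DeviceID = 'D:'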

Now, based on the error list in BIDS, it would appear this is sufficient to make the task ready for use. However, if you were to press OK in the Task Editor dialog and open the error list again, you’d see:

alt text here…

This quirk, of identifying the lack of a connection only after solving the previous error, can be annoying. Later on, I’ll show you how Mist improves on this.

In the meantime, the next step is to create a WMI connection, so the task knows where to run the query. To do that, you can use the WmiConnection property, which lets you set the target system to run the WQL query on, as well as authentication for the connection. If you click inside the WmiConnection field, you’ll see an ellipsis button. Click on the button to open a dropdown.

alt text here…

Inside the dropdown, click on ‘New WMI connection…’ to open the Connection Manager Editor dialog.

alt text here…

This dialog creates a new connection manager for the WMI data reader; the important fields are Server name and Namespace.

Server name identifies the machine to run this query on. Currently, it’s set to localhost. Since I’ll be running this task on my local machine, I’ve checked the Use Windows Authentication checkbox. If I don’t check it and try to run the task in BIDS, I’ll get an error that ‘User credentials cannot be used for local connections.’ Pressing the dialog’s Test button would also indicate the problem.

Namespace is a key field for WMI. The brief explanation is that different namespaces provide different types of functionality, known as classes. The root\cimv2 namespace (cimv2 is an abbreviation for Common Information Model v2) allows you to access most physical and logical components in a system. You can browse additional namespaces and their classes using tools such as WMI CIM Studio and Scriptomatic 2.0.

Once you’ve set up your server name and namespace, press OK in the dialog.

The next step is to choose the format of the returned WMI Data. This is specified in the Output Type property and you can choose from Data Table, Property name and value, and Property value. You can keep the default Data Table. Further down, I’ll show you how the output looks when you use the different types.

Now that you’ve entered the query, the machine to run the query on, and the format of the query results, you need to specify where the data will go.

The Destination Type property specifies if the destination is a variable or a file. For this example, keep the default and save the data to a file.

To specify the file, click inside the Destination property’s text field and click on its ellipsis button. Just as with the WmiConnection property, click on New Connection to open the File Connection Manager Editor.

alt text here…

Change the Usage type to Create file, and specify where you want the file created.

alt text here…

Our final step is to set the OverwriteDestination property. This property controls what happens if your destination file already has data when you run a query. You can choose to:

  • Keep original – Keeps the destination file’s data untouched
  • Append to destination – Appends your new data to the file
  • Overwrite destination – Overwrites the file with your new data

Use the Keep original default.

With those steps done, the WMI Options are set as follows:

alt text here…

You can now press OK to close the dialog and return to the designer. The task’s red X icon is gone since it now has all the information it needs.

If you run this task in BIDS, by pressing F5, you should see it turn green. Furthermore, if you navigate to C:\WMI Data, you should see a data.csv file. If you open it in Notepad, you’ll see something like:

alt text here…

Note that if you try to run this task again, you’ll get an error since C:\WMI Data\data.csv already exists. To prevent that, you’ll need to change the OverwriteDestination value to append or overwrite.

If you re-run this task with the ‘Property name and value’ output format, the generated file contains:

alt text here…

If you select just Property Value, the contents are:

alt text here…

Using Mist:

Now let’s see how to do the above in Mist.

To start, you’ll need to create a package. If you’re unfamiliar with how to create a package, you can follow the first couple steps in the Mist User Guide’s Creating a Package topic.

Once your package is open, navigate to the Toolbox tool window.

alt text here…

Select and drag the WMI Data Reader task onto the Package’s design surface.

alt text here…

This task has a red X icon, just like in BIDS, since it has errors.

alt text here…

However, you’ll notice that unlike BIDS, we list all errors upfront, along with recommendations for fixing them.

Inside the Package Details tool window, you’ll find the same properties as the WMI Data Reader dialog in BIDS.

alt text here…

Our first step is to enter the WQL query. In the Source area, there is the Method dropdown, which lets you set the task’s WqlQuerySourceType property. It’s already set to the default value of Direct. Beneath the drop down is a text area, where you can enter the query. Afterwards, update the Overwrite Destination property to Overwrite.

alt text here…

The next step is to set the two connections you’ll need.

If you click on the WMI Connection dropdown, it only contains a value for (No Selection). Thus, you’ll need to add a WMI Connection to the project. You can follow the steps in the Mist User Guide’s Create New Connection section to create a connection, although you’ll want to make a WMI connection rather than an Ole DB connection.

The designer for the WMI connection has two red bordered fields, clearly indicating what you need to enter.

alt text here…

From the BIDS sample, provide an appropriate namespace and server name, and set Use Windows Authentication to true.

alt text here…

Next, create a File connection and open its designer.

alt text here…

Matching the BIDS sample, switch the File Usage type to Create File. Clearly, you also need to enter a File Path. You can also rename this connection in case you want to use it elsewhere.

alt text here…

Now, reopen the package and select the WMI Data Reader task to view its details again. In the WMI Connection combo box, you can now select WmiConnection1. In the Destination’s File Connection combo box, you can select WMIDataCsvConnection.

alt text here…

And that’s it. Your task is done and ready to run. To test this package, let’s run it using DTEXEC. To do that, right click on the package in the logical view.

alt text here…

Select the Build & Run menu item. This menu item builds the Mist package, which produces a DTSX package that DTEXEC will execute, just like in BIDS.

Next Time:

In my next post, I’ll look at a sample using the WMI Data Reader task and a related task.

Links:

WMI Data Reader documentation for the BIML language

WMI Data Reader task – MSDN

WQL Keywords

WQL Operators

WMI Win32 classes

Craig

Thumbnail

How To - Tasks and Transformations : Send Mail

Written by Varigence Blog on 4.5.2011

In my last post, I discussed the basics of the Execute SQL task, and how to build it in both BIDS and Mist. Today, I’m going to discuss a more straightforward task – Send Mail – and show how you might use it with Execute SQL to create a simple workflow.

Background:

It should come as no surprise that the Send Mail task is used to send messages. A common use case is notifying a user about the success or failure of tasks in your workflow.

The properties that are unique to the Send Mail task align with email fields:

  1. To
  2. BCc
  3. Cc
  4. From
  5. Priority
  6. Subject
  7. Attachments

Using BIDS:

To add a Send Mail task to a package in BIDS, first open the Toolbox tool window and find the Send Mail task.

alt text here…

Drag and drop the task onto your package’s design surface. You’ll notice that the task has a red X icon, indicating that there are errors.

alt text here…

If you open the error list, you’ll see some errors and warnings pertaining to the task.

alt text here…

You can begin tackling these errors by double clicking on the Send Mail task in the designer. This opens up a modal dialog that lets you set various properties on the task. The above errors all pertain to missing or invalid properties on the Send Mail task. We’ll need to edit the Send Mail task’s properties to correct the issues.

alt text here…

The From, To, Cc, BCc, and Subject properties are all strings so you can just enter text directly. BIDS will check that address lines have properly formatted strings.

For example:

alt text here…

The actual email message’s content can come from different sources. The MessageSourceType property indicates the type of source; its dropdown allows you to choose from:

  • Direct Input – Enter text directly into a property
  • File Connection – Specifies a file that contains the content. The file may already exist or may be created at runtime. This may be used if you want the mail to include the contents of a log file.
  • Variable – Create or choose a variable that contains the content.

For the Direct Input scenario, you can enter text directly in the MessageSource field. While doing this, try changing the message’s priority to High so it gets Bob’s attention.

alt text here…

With the basic email ready, you now need to provide an SMTP Connection so that when the package runs, it can actually send this email. In BIDS, clicking on the dropdown button in the SmtpConnection field allows you to create a new connection by opening the SMTP Connection Manager Editor.

alt text here…

The important field here is the SMTP server field. For this example, I entered a phony SMTP server; naturally, you’d enter a real one.

Finally, you can add an attachment. If you click inside the Attachments field, you’ll see a button with an ellipsis.

alt text here…

Click on the button to display an Open file dialog box, which allows you to select attachments for your mail. If you want to attach multiple files, you’ll need to open each one separately via the Open file dialog box. BIDS will add a | character between each file path. Note that this method of referencing files differs from elsewhere in SSIS, where connection managers are used instead.

alt text here…

With that, you can press OK in the Send Mail Task Editor dialog to return to the designer. You’ll see that the red X icon is gone since the task now has all the information it needs.

Now that you’re done with setup, you still need to know when to run the task. As mentioned at the beginning of this post, a typical use of Send Mail is to notify a user when a task succeeds or fails. For instance, if you had an Execute SQL Task that failed, you might want to tell someone.

For this example, imagine that you have the Execute SQL task that was built in the previous blog post. You can connect that task to the Send Mail task, so that the result of the Execute SQL task controls whether the Send Mail task runs. To do that:

  1. Select the Execute SQL Task
  2. Drag its arrow from the Execute SQL Task to the Send Mail task
  3. Double click on the arrow

alt text here…

When you double click on the arrow, the Precedence Constraint editor will open. This dialog lets you control when the Send Mail task runs. In this case, if the Execute SQL Task fails (notice that the Value field is set to Failure), the workflow will follow your arrow by executing the Send Mail task.

alt text here…

Using Mist:

Now, I’m going to show you how to create the same Send Mail task and workflow in Mist.

To start, you’ll need to create a package. If you’re unfamiliar with how to create a package, you can follow the first couple steps in the Mist User Guide’s Creating a Package topic.

Double click on your package to open it in the designer. The designer will be empty since nothing has been added to the package yet.

alt text here…

Now is a good time to locate the Package Details and Toolbox tool windows. The Package Details window is typically located in the lower portion of the Mist window.

alt text here…

The Toolbox tool window is typically in the lower right portion of the Mist window.

alt text here…

Once you locate the Toolbox tool window, select and drag the Send Mail task onto the Package’s design surface.

alt text here…

Just as in BIDS, there’s a red X icon to indicate that the Send Mail task has errors. And just like in BIDS, the error list window lists the errors pertaining to this task.

alt text here…

One thing you may have noticed when you dropped the Send Mail task onto the package is that the Package Details area is immediately updated to display the Send Mail task’s properties.

alt text here…

Just like in the BIDS dialog box, you can fill in addresses for From, To, Bcc, and Cc, along with a subject line.

alt text here…

If you click on the dropdown for the SMTP Connection, you’ll see that it only contains a (No Selection) entry, meaning that you need to add a new SMTP connection to your project.

To do that, you can follow the steps in the Mist User Guide’s Create New Connection section to create a connection, although you’ll want to select a SMTP connection type.

alt text here…

Once created, select the new SMTP Connection in the Logical View and switch to the Properties tab to see its values.

alt text here…

alt text here…

You’ll see the Smtp Server property has a red border, indicating that leaving this property blank produces an error. To fix the issue, you can enter smtp.example.com. This causes the red border to disappear, indicating that the error has been fixed.

alt text here…

Mist is all about reusing assets whenever possible. This SMTP connection can be used for every task in your project. Additionally, you can edit the connection in one place if you need to change it in the future.

To simulate the BIDS workflow, drag and drop an Execute SQL task from the Toolbox onto the design surface. Then drag a line from the bottom anchor of the Execute SQL task to the top anchor of the Send Mail task.

alt text here…

Once done, you’ll end up with an edge between the tasks. If you select the edge, the Package Details area updates to display precedence constraints in the workflow.

alt text here…

alt text here…

Notice that there’s a Data Grid that specifies the actual task whose output will control the execution of the Send Mail task. This is especially useful when a task has several inputs. As before in BIDS, you can choose an appropriate Evaluation Operation and an Evaluation value.

Alternatives:

For reporting errors on a whole container or package, you would typically take a different approach than the one we’ve outlined. You might be tempted to attach an OnError event handler to the container and, within the handler, add your Send Mail task. However, this approach is not recommended as it can result in being bombarded by error emails. Check out this link for a suitable way to handle this scenario.

Summary:

This concludes my look at the Send Mail task, and building a simple workflow with the Execute SQL task. Next time, I’m going to review another workflow task.

Links:

Send Mail documentation for the BIML language

Send Mail task - MSDN

Craig

Thumbnail

How To - Tasks and Transformations : Execute SQL

Written by Varigence Blog on 3.29.2011

While developing Mist, we found that there’s little in the way of online resources for learning how to use tasks and transformations in your workflows. While you can find explanations of tasks and transformations, examples aren’t as common for most of them.

Additionally, our experience with BIDS is that editing packages can be cumbersome. One goal for Mist was to make package workflows faster and easier to author.

Thus begins a series of blog posts that demonstrate how to use the more than sixty tasks and transformations in your packages, with examples in both BIDS and Mist.

Background:

For our inaugural post, I’m going to discuss the Execute SQL task. You might be wondering why I’m starting with a task that already has a ton of articles on it. Well, there’s a reason it warrants a lot of coverage; it’s a complex task. Additionally, it’s a great starting point for seeing what Mist can do and how it improves on BIDS. I’ll also be referring to this task in upcoming posts in this series.

The task’s purpose is to allow you to run SQL statements or stored procedures within a package’s workflow. You can capture values from the execution of the SQL query to be used later in the package. Some of the operations you might perform with the Execute SQL task include:

  • Creating tables and views
  • Returning a rowset from a query and saving it in a variable
  • Executing a stored procedure

The example I’m going to show you how to build will run a query on the AdventureWorksOLTP database, with that query stored in an SSIS variable and its result placed in another variable.
Using BIDS:

The first step is to add the Execute SQL task to a BIDS package. To do that, find the Execute SQL task in the Toolbox tool window.

alt text here…

Then drag and drop the task onto your package’s design surface. You’ll see a red X icon in the task, indicating that there are errors.

alt text here…

Opening the BIDS Error List tool window shows the actual errors pertaining to this task.

alt text here…

By the time you’ve finished the steps below, there will be no errors on the Execute SQL task, and the red X icon will be gone.

Double click on the Execute SQL task to begin setting it up. A dialog box will open that lets you set properties on the task.

alt text here…

You can start by configuring where the SQL statement will run. Find the ConnectionType property in the SQL Statement group. Click inside the ConnectionType property field, and click on the down arrow to display its dropdown.

alt text here…

This dropdown lists the different connection types you can use. Since AdventureWorks is a SQL Server database, you’ll want to stick with the default, OLE DB connection type.

Beneath the ConnectionType property is the Connection property. If you click inside the Connection property field, you’ll again see a down arrow. Clicking on it displays a pop-up.

alt text here…

Select the New connection… item to bring up the OLE DB Connection Manager dialog box.

alt text here…

This dialog lets you create and manage data connections. For instance, if you already had a connection to Adventure Works, you could simply select it from the Data connections list and press OK. However, since there aren’t any connections, you’ll need to create one by pressing the New button, which greets you with a Connection Manager dialog box.

alt text here…

If you’ve installed Adventure Works on your local machine, you can type in your machine name in the Server name text field. Or you can simply enter a period to represent your local machine. Pressing the down arrow in the Server name text field causes the dialog to scan the network for SQL instances, which may take several minutes in some environments.

alt text here…

Once you enter a server name, you’ll notice that the radio buttons, in the Connect to a database groupbox, become enabled.

alt text here…

Select the AdventureWorks database from the enabled dropdown.

alt text here…

Finally, press OK to finish creating the connection. This will return to you to the Connection Manager dialog, which now lists the new connection.

alt text here…

Press OK again to set this data connection on the Execute SQL task.

alt text here…

Next up is the SQLSourceType property. This property specifies where the SQL statement comes from. The dropdown allows you to choose from three options:

  • Direct Input – Enter text directly into a property
  • File Connection – Specify a file that contains the SQL statement. The file may already exist or may be created at runtime.
  • Variable – Create or choose a variable that contains the SQL statement.

Select Variable from the SQLSourceType dropdown.

alt text here…

Notice that the SQLStatement property, directly underneath SQLSourceType, has been renamed to SourceVariable. Opening its dropdown lists all the built-in variables available to us.

alt text here…

However, the dropdown also includes a New variable… item at the top. Select that item to open the Add Variable dialog box.

alt text here…

This dialog allows you to set various properties on your new variable:

  • Container – Indicates the scope of the variable. By selecting Package1, this variable becomes available to all containers, tasks, and event handlers within the package.
  • Name – Specifies a name for the variable.
  • Namespace – SSIS provides two namespaces, System and User. The System namespace contains system variables and the User namespace contains custom variables. However, you are also allowed to create additional namespaces.
  • Value type – Specifies the type of literal that the variable stores. Note that the variable’s value must be of the value type. For example, having a variable value of abc with a value type of DateTime won’t work.
  • Value – The value that the variable stores.

In order to make the purpose of this variable clearer, rename it to SqlQuery. You can leave the remaining properties as-is.

alt text here…

Press OK to return to the properties dialog.

The last property in the SQL Statement group is BypassPrepare. The Execute SQL task can run the prepare command on a SQL statement before it executes on our specified connection. The BypassPrepare property makes it possible to skip that preparation. This is typically used when a SQL statement includes a keyword that the prepare command doesn’t support.

alt text here…

You’ve now established where the SQL statement will run and identified where the SQL statement will be stored. The next step is to actually write the SQL statement. To bring up the Query Builder dialog, click on the Build Query… button in the Execute SQL Task Editor dialog.

alt text here…

For this example, you can query your product inventory for items where the quantity on hand is less than 100. A query like this might be useful to inform someone that they need to increase production of an item. The most direct way to enter a query is to type it in the textbox where SELECT and FROM are already present.

alt text here…
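The screenshots don’t reproduce the query text, but against the AdventureWorks OLTP sample a query fitting this description would look something like the following (table and column names assumed from the standard sample schema):

SELECT ProductID, LocationID, Quantity
FROM Production.ProductInventory
WHERE Quantity < 100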

After entering the query, if you click in the white space area above the data grid, you’ll see that:

  1. The query text is reformatted
  2. The data grid is populated with values that match your query
  3. A visual representation of the table and the column selections appears at the top of the dialog

alt text here…

By pressing the Run button in the Query Builder’s toolbar, you can have the dialog execute your query so you can see the results.

alt text here…

alt text here…

This is great for confirming that your query returns the results you expect.

Now, press OK to save the query in the variable.

Returning to the Execute SQL Task Editor dialog, it’s time to discuss the ResultSet property. This property lets you control the format of the results from the executed query. The property value is a dropdown that lets us choose from:

  • None – When the query returns no results, such as a statement that deletes records from a table.
  • Single Row – When the query returns one row, such as a SELECT statement that returns a count.
  • Full result set – When the query returns more than one row, such as a SELECT statement that retrieves all the rows in a table.
  • XML – When the query returns a result set in an XML format.

In this example, it’s clearly possible to return multiple rows. Thus, you should select Full result set from the dropdown.

alt text here…

The final step is to define where the result set produced by the SQL query will be stored in the package. To do that, switch from the dialog’s General view to the Result Set view. Below, you can see that Result Set is selected in the left list box.

alt text here…

To add a Result Set mapping, click the Add button, which will add a default mapping to the list.

alt text here…

To provide a clearer name, change the result name to ‘ExecuteSqlResult’ and create a new variable named Result to store the result. Note that when creating this variable, its value type needs to be Object.

alt text here…

Unfortunately, if you try to run this task, you’ll hit an error that “The result binding name must be set to zero for full result set and XML results.” It turns out that there are specific rules about result set names when using various result set types. Microsoft has a webpage that lists these rules. In our case, we need to rename our Result to the number 0.

alt text here…

With that change, the Execute SQL task can be run successfully.

Using Mist:

Now that you’ve created the Execute SQL task in BIDS, let’s review how to create it in Mist. As I demonstrate the process, you’ll notice that the procedure for creating tasks will be familiar to BIDS users, but with some differences that help streamline the process of editing tasks.

To begin, you’ll need to create a package. If you’re unfamiliar with how to create a package, you can follow the first couple steps in the Mist User Guide’s Creating a Package topic.

Once your package is open, navigate to the Toolbox tool window.

alt text here…

Drag and drop the Execute SQL task onto the Package’s design surface.

alt text here…

The task has a red X icon, just like in BIDS, since it has errors.

Furthermore, the Error List tool window lists all the errors for the Execute SQL task upfront.

alt text here…

You’ll notice that these errors are much clearer than the ones in BIDS. Additionally, Mist lists all errors with the task upfront, unlike BIDS, which initially missed that SqlCommand was blank.

The next step is to display the Package Details tool window. The Package Details window is typically located in the lower portion of the Mist window.

alt text here…

If you can’t find the Package Details tab, you can make the window appear by switching to the Home ribbon tab and clicking on the View popup button.

alt text here…

In the list, click on the Package Details item to make the window appear.

The Package Details area displays settings for the selected task or transformation in the package. If the Execute SQL task isn’t selected, click on it in the designer to select it. The Package Details area will now be populated with properties for the task.

alt text here…

These properties should look familiar, since they’re the same properties you saw in the BIDS Execute SQL Task Editor dialog. However, you’ll notice that Mist is not using a modal dialog. In Mist, we’ve avoided dialog boxes as much as possible. Additionally, using tool windows lets you easily switch between tasks or make changes on the fly, instead of having to dismiss several dialogs and reopen them later.

Following the steps in the BIDS example, the first step is to create a connection to Adventure Works. To do that, you can follow the steps in the Mist User Guide’s Create New Connection section to create a connection, although you’ll want the Initial Catalog to be AdventureWorks, not AdventureWorks LT. It’s also worth reemphasizing that this approach allows you to easily find and reuse this connection elsewhere in your Mist project, including other packages.

To associate this connection with the Execute SQL task, reopen the Package by double clicking on it. Then click on the Properties tab to display the Properties window. The Properties window is typically located in the lower right portion of the Mist window. If you can’t find it, you can use the aforementioned View button dropdown to make it appear.

alt text here…

You’ll see that the Connection property dropdown has a red border surrounding it. The red border makes it clear that this property needs to be set, and you’ll experience a build error until this is fixed. You can fix it by clicking on the dropdown’s arrow and selecting the AdventureWorks connection.

alt text here…

The next step is to write the SQL query. To do that, go back to the Package Details tab and look at the T-SQL Editor.

alt text here…

This control is designed to let you enter a T-SQL query, providing syntax highlighting and Intelliprompt along the way. You’ll notice that the Connection dropdown reads AdventureWorks; this dropdown is linked to the Execute SQL Task’s Connection property. The green icon to its right means that we’ve established a connection to the database, thereby enabling Intelliprompt.

As you type in the query, the Intelliprompt popup will offer you possible values. And those possibilities will be filtered as you continue typing.

alt text here…

Once you’ve entered the query, you can press the Preview Query button to execute the query and retrieve the first 100 results. Just like in BIDS, this lets you confirm that your query returns the results you expect.

alt text here…

Now that you’ve authored the query and know it’s valid, you need to move it into a variable and associate that variable with this task.

To start, select the SQL query and press Ctrl+C to copy it to the clipboard.

To create a package variable, go into the package designer surface and click anywhere in the white area. This causes the package to be selected. You may have noticed that the ribbon’s contents changed a bit when you did this. While all tasks and transformations have a common set of assets, the ribbon also lets you add items specific to the selected task or transformation. For instance, when the Execute SQL task is selected, the ribbon appears as:

alt text here…

Notice that the ‘Task Specific’ group, at the right, contains Parameter and Result buttons. Selecting the package changes the ribbon’s contents to:

alt text here…

The Parameter and Result buttons have been replaced with Connection Reference and Package Configuration buttons. Note that the first six buttons haven’t changed since they’re common to all tasks and transformations.

With the Package selected, click on the Variable button in the ribbon. This will add a variable to the package. Additionally, the Variable tool window will be brought to the front of the window.

alt text here…

The newly created Variable is already selected. Matching the BIDS sample, rename this variable to SqlQuery and set its namespace to User. To change its name, you can double click on its default name in the Variable column and type in the new name.

alt text here…

To store the query in this variable, double click on the Value cell and press Ctrl+V to paste in the query.

To associate the variable with the Execute SQL task, select the Execute SQL task in the designer. In the Package Details area, find the Method dropdown. This dropdown parallels the SqlSourceType dropdown in BIDS. Click on the dropdown to select Variable.

alt text here…

You’ll then see that the T-SQL editor vanishes and is replaced with a Variable dropdown. Don’t worry – if you want to test the query again or try another query, you can switch back by changing the Method value back to Direct.

In the Variable dropdown, scroll down to the bottom and select the User::SqlQuery variable.

alt text here…

To finish this up, you need to create a Result Set. First, go to the Result Set dropdown, directly underneath the Name textbox and select Full.

Next, in the ribbon, click on the Result button.

alt text here…

When you do that, you’ll see that the Result item was added to the Results data grid in the Package Details area.

alt text here…

Now you need to create another variable, using the same steps as before, named User::Result. Recall that the variable’s Type needs to be Object.

alt text here…

With that done, you can double click on the Result’s Variable cell to open its popup and select the User::Result variable.

alt text here…

Finally, change the Result’s Name to 0.

And with that, you’ve built a matching Execute SQL task in Mist. If you build your package, following the steps in the Mist User Guide’s Build the Example Project topic, you can open the Package1.dtproj in BIDS to see that it matches the sample you manually built in BIDS and also runs successfully.

Summary:

This concludes our first look at Mist’s package editing capabilities, along with how to construct an Execute SQL task. Next time, I'll talk about what you might do in your workflow if the Execute SQL task fails during execution.

Links:

Execute SQL documentation for the BIML language

Execute SQL task – MSDN

Craig

Thumbnail

Time Functions with Date Intelligence

Written by Varigence Blog on 3.24.2011

What is Date Intelligence?

Date Intelligence in Vivid is not a single feature, but rather a suite of related features. To get the most out of the BI data, analysts need to be able to query their cubes to obtain the dataset from specific points in time. Relevant time periods are highly variable between industries and ever-changing within companies. Our Time Functions are designed to make date ranges easier to work with.

The Time Functions

Vivid’s Time Functions make it simple for users to create Excel calculations that perform sophisticated date/time manipulation. To access the Time Functions, simply right click the name of the measure, in the pivot table overlay, that you wish to analyze over a period, and then select “Time Functions”. From there, select the Time Function you’d like to invoke: Periods to Date, Parallel Periods, or Rolling Results. The final result is a new calculation that can be placed in the pivot table. These are native Excel calculations, fully functional for your team members who don’t yet use Vivid. Additionally, these calculations can be used in the formula of other custom calculations you create with Vivid.

https://varigencecomstaging.blob.core.windows.net/blogimages/TimeFunctions.png

Periods to Date

For a given measure, Periods to Date will create a calculation that computes a roll-up of the specified period, up to the date at the current location in the pivot table. Here is a pivot table that has a measure (Internet Order Count) and a calculation which is the Periods to Date version of this measure. The period used for this example is Fiscal Quarter:

https://varigencecomstaging.blob.core.windows.net/blogimages/PeriodsToDate.png
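For reference, a Periods to Date calculation like the one above would typically be expressed in MDX along these lines (my sketch; the post doesn’t show it in text):

Aggregate(PeriodsToDate([Date].[Fiscal].[Fiscal Quarter], [Date].[Fiscal].CurrentMember), [Measures].[Internet Order Count])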

From the screenshot above we can observe the following:

We see blanks for Parallel Internet Order Count for two distinct reasons:

  • Since we’re lagging one quarter, periods that are of larger granularity don’t map to lagging by a quarter. For example, there is no simple notion of a Fiscal Year that lags the current Fiscal Year by a quarter.
  • Q1 FY 2002 is blank because there is no data prior to that quarter in the cube.

Regardless of which row you’re looking at, the Parallel Periods calculation is always lagged by one quarter. It doesn’t change on a row-by-row basis.

The underlying MDX created for this query looks like this:
([Measures].[Internet Order Count], ParallelPeriod([Date].[Fiscal].[Fiscal Quarter], 1, [Date].[Fiscal].CurrentMember))
Rolling Result

Rolling results will compute the rolling aggregation for a specified measure. Rolling results asks you to specify which hierarchy to roll against, most commonly a hierarchy from a time dimension. Additionally you specify the number of periods to roll-up. Like Periods To Date, you can also specify your aggregation operator. In the below example we use the SUM aggregation and set the Rolling Result:

https://varigencecomstaging.blob.core.windows.net/blogimages/RollingResult.png

Notice that Q1 FY 2002 is the same for both Internet Order Count and RR Internet Order Count; that’s because there’s no previous quarter to sum in the roll-up.

The rolling calculation is set to roll up the past two periods at whichever level it’s at. Here are some examples:

  • Q2 FY 2002 is the sum of 565 and 448, the current and previous quarters.
  • 2003 is the sum of 3222 and 2206, the current and previous years.
  • August 3, 2002 is the sum of 6 and 8, the current and previous days.

The underlying MDX created for this query looks like this:

Sum(LastPeriods(2, [Date].[Fiscal].CurrentMember), [Measures].[Internet Order Count])
Editing Calculations

While the Vivid UI can create these custom Date Intelligence calculations, you can also tweak these calculations using MDX. Simply right-click the calculation in the Pivot Table Editor and select “Edit…” from the context menu. In the Edit Calculations dialog you can modify the calculation to do exactly what you want.

Feedback

Please feel free to give us feedback on our forums regarding your use of these Time Functions, as well as suggestions for any additional Time Functions that you’d find useful.

Kang Su Gatlin

Thumbnail

Overriding a WPF binding value without bindings

Written by Varigence Blog on 3.11.2011

First off, let me briefly introduce myself. My name is Craig and I'm the lead developer on Mist. My blog posts will typically be focused on developing Mist and developing BI solutions in Mist.

I recently encountered a situation where I needed to override a binding's value without changing the binding. If that statement seems confusing to you, allow me to back up and explain further.

Background:

All of the designers in Mist are implemented as WPF Pages. Within each designer is a variety of controls, e.g., ComboBoxes, DataGrids, CheckBoxes. Many of these controls have their IsEnabled bindings set in XAML.

A scenario arose where we wanted the user to be able to view an editor's content, but not edit it. Sort of a 'look but don't touch' behavior. At first glance, this seemed like a trivial problem; simply add or alter IsEnabled bindings to detect when to disable each control. The problem I faced was that each designer has lots of controls, all of which would need to implement this behavior. Furthermore, many of them already have IsEnabled bindings. I really didn't want to deal with the time consuming and cumbersome choice of adding or editing each control's IsEnabled binding. I also wanted to avoid the maintenance cost of ensuring that all future changes account for this new IsEnabled logic.

Solution:

Some of you may already be thinking that the right way to solve this is to use default styles. And you're right. (For those of you unfamiliar with default styles, check out MSDN Default (Theme) Styles and view a simple example at Inheriting Default Styles in WPF). But the problem remains of how to change the default style so we can override the control's IsEnabled value while still preserving its binding.

The way I chose to solve this was by using an attached property:
using System.Windows;
using System.Windows.Data;

public static class ControlExtension
{
    public static readonly DependencyProperty IsReadOnlyProperty =
        DependencyProperty.RegisterAttached(
            "IsReadOnly",
            typeof(bool),
            typeof(ControlExtension),
            new PropertyMetadata(false, new PropertyChangedCallback(OnIsReadOnlyChanged)));
 
    public static void SetIsReadOnly(DependencyObject element, bool value)
    {
        element.SetValue(IsReadOnlyProperty, value);
    }
 
    public static bool GetIsReadOnly(DependencyObject element)
    {
        return (bool)element.GetValue(IsReadOnlyProperty);
    }
 
    private static void OnIsReadOnlyChanged(DependencyObject d, DependencyPropertyChangedEventArgs e)
    {
        if (e.NewValue is bool)
        {
            if ((bool)e.NewValue)
            {
                // Set a local value; this takes precedence over any
                // IsEnabled binding, forcibly disabling the control.
                d.SetValue(UIElement.IsEnabledProperty, false);
            }
            else
            {
                // Clear the local value so the control's IsEnabled
                // binding, if any, is back in effect...
                d.ClearValue(UIElement.IsEnabledProperty);
 
                // ...and refresh that binding's value.
                var bindingExpression = BindingOperations.GetBindingExpression(d, UIElement.IsEnabledProperty);
                if (bindingExpression != null)
                {
                    bindingExpression.UpdateTarget();
                }
            }
        }
    }
}

When IsReadOnly becomes true, we set the DependencyObject's IsEnabled property to false, forcibly disabling the control. Note that any IsEnabled binding supplied by a style is overridden, because the local value set via SetValue takes precedence in WPF's dependency property value resolution. If IsReadOnly returns to false, we first clear the IsEnabled property's local value and then, if the control has an IsEnabled binding, call UpdateTarget to refresh the binding's value.

[Image: buttonIsReadOnlyStyle.png]

To add this behavior in XAML, we can create or edit a default style for each relevant control, adding a setter that binds its ControlExtension.IsReadOnly attached property. Our controls then inherit their theme styles and pick up the new behavior, as in the sketch below.
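For illustration, a default Button style with this binding might look something like the following (a minimal sketch: the local XML namespace mapping and the IsViewOnly view-model property are assumptions for the example, not names from Mist):

<!-- Implicit style: applies to every Button that doesn't specify its own style.
     BasedOn preserves the theme's default look and feel. -->
<Style TargetType="{x:Type Button}" BasedOn="{StaticResource {x:Type Button}}">
    <!-- Bind the attached property; when IsViewOnly becomes true, the
         PropertyChangedCallback above disables the control. -->
    <Setter Property="local:ControlExtension.IsReadOnly" Value="{Binding IsViewOnly}" />
</Style>

One such style per control type, kept in a shared ResourceDictionary, then covers every instance of that control in the designer with no per-control edits.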

Craig Lichtenstein

Thumbnail

Why is Vivid an Excel Add-In?

Written by Varigence Blog on 3.9.2011

A new user’s first reaction after using Vivid for a few days is usually “I’ll never use Excel without this again!” This raises the question: why is Vivid a Microsoft Excel add-in rather than a standalone application? After all, Vivid provides enough unique functionality to justify licensing it as a standalone product. Here are some of the reasons we’ve chosen to develop and market Vivid as an Excel Add-In:

  1. Excel is already the dominant SSAS analysis client. Every BI analyst keeps Excel open on their desk and already knows how to use it. Why add a completely new tool with all new idioms and patterns? We instead build on the Excel idioms and patterns, so that current users feel right at home.

  2. Excel has excellent BI capabilities. This hasn't always been the case, but since 2007 Microsoft has greatly expanded Excel's BI capabilities. Rather than reinventing the wheel and reimplementing all of Excel's functionality, we decided to build a new and unique BI feature set on top of Excel's capabilities. Since we don't have to implement things like drilldown, filters, or slicers, we can focus on adding the features Excel chronically lacks.

  3. Vivid-enhanced Excel projects still work for non-Vivid users. Because Vivid builds directly into Excel and on top of Excel workbooks, users can add value with Vivid, and the pivot tables they create still work for colleagues who don't have it. An individual analyst using Vivid, working with a team that still uses vanilla Excel, can share workbooks with no friction. Of course, getting everyone on Vivid has great synergistic benefits, but it’s a choice, not a fixed conversion cost.

  4. Vivid users retain full access to Excel charts. Charting is an important aspect of any BI tool, and again this is an area where Excel has good support. Since we use native Excel pivot tables, Vivid users continue to be able to use all of the capabilities associated with Excel charts.

  5. Vivid works with other Add-Ins. The Excel add-in model allows Vivid to work side by side with other add-ins. So if your organization has investments in other Excel add-ins to improve workflow, Vivid fits right in. There is no concern that you lose functionality provided by an add-in when you decide to use Vivid. With Vivid, users continue to leverage all of their existing investments in Excel.

As an Excel Add-In developer, we take your productivity to heart, and we've taken the important step of ensuring you're productive in your day-to-day tool: Microsoft Excel.

Kang Su Gatlin

Thumbnail

Vivid 1.5 Released from Varigence!

Written by Varigence Blog on 2.11.2011

Vivid is an indispensable tool that brings advanced analytics capabilities directly into Microsoft Excel. The new 1.5 version of Vivid adds powerful new self-service analytics features for both teams and individual analysts.

Vivid: A Recap

Vivid is an add-in that transforms Excel (2007/2010) into a world-class self-service BI tool with over 60 new features and capabilities. In addition to helping you use Excel pivot tables more effectively with SSAS cubes and PowerPivot workbooks, Vivid offers a rich collaborative commenting model that makes your entire team more effective. Because Vivid builds directly on native Excel pivot tables, you can seamlessly share your spreadsheets with non-Vivid users, Excel Services, and SharePoint. If you use standalone analytics clients such as ProClarity and wish many of those advanced features were available in Excel, Vivid is the answer. For more information about Vivid, visit its product page.

The Vivid 1.5 Release

Vivid 1.5 adds the following capabilities:

  • PowerPivot support. Use the Vivid Pivot Table Editor to browse and create pivot tables for PowerPivot workbooks, whether hosted in SharePoint or embedded in the Excel workbook.
  • Slicer support (Excel 2010) directly in the overlay and in the Vivid Pivot Table Editor.
  • Virtual Dimensions, which let you cluster related hierarchies together. Useful for creating dimensions that don't exist on the actual cube.
  • Virtual Hierarchies, which allow you to create a new pseudo-user hierarchy. These define a drilldown path, and are of particular use for PowerPivot, which doesn't have built-in support for user hierarchies.
  • Several bug fixes and performance improvements.

Vivid 1.5 is available for download today here with a 14-day free trial. Vivid 1.4 customers can upgrade to Vivid 1.5 at no cost.

Thumbnail

Welcome to the Varigence Blog

Written by Varigence Blog on 1.10.2011

You've made it to the Varigence blog, welcome!

The first thing you should know is who we are. Varigence team members live and breathe business intelligence and application development. We were brought together by a vision that business intelligence could be made much more efficient and effective. As a business we currently have one shipping product, Vivid, and another product, Mist, slated for release at the end of January 2011. Varigence also provides consulting and training services to help customers accelerate adoption of our products and frameworks.

The goals of this blog are:

  • Develop content that will enable you to rapidly build better solutions with the Microsoft SQL Server product family.
  • Demonstrate the advantages of building Microsoft SQL Server solutions with the Varigence tools and frameworks.
  • Share discoveries and insights about the Microsoft SQL Server product family that were made during product development.
  • Provide insight into the decisions and motivations used in building products such as Vivid and Mist.
  • Go under the hood on technical aspects of building our business intelligence products.

Here you'll see articles covering new product releases, tips on getting more out of our products' features, general BI and data warehousing development strategies, .NET developer tips, Office developer tips, musings about the industry, and anything else we think will be of value to readers. Let us know if there's something specific you'd like us to write about.