Federal Reserve Sample Overview

Introduction

The Federal Reserve is the central banking system of the United States. A key function of the Federal Reserve is to implement monetary policy; one way it does so is by influencing the interest rates of various economic assets. Viewing and comparing the performance of different interest rates can provide insight into the changing state of the United States economy.

Sample Background

This overview is accompanied by a Mist sample that creates an SSAS cube containing Federal Reserve interest rate data. The cube can be used to analyze interest rates among various securities using Excel, Reporting Services, and Vivid. To create the cube, this sample takes interest rate data, which is inconveniently strewn across numerous hard-to-read files on the web, and organizes it into consistent dates and a structured format. It also models the tables needed to hold this data, then creates them in SQL Server and populates them. Finally, the sample models a cube that can be built into an SSAS cube and then deployed for user analysis.

Preparing to Run the Sample

Requirements

The sample assumes that you have the following installed on the same machine where Mist is installed:

  • SQL Server Management Studio 2008 R2 (SSMS)
  • Business Intelligence Development Studio (BIDS)

Downloads

You can download the sample from here.

Reminders

If this is your first use of Mist, you may need to obtain a license key.

When run, this sample will create a database named FederalReserveInstruments in SQL Server. If such a database already exists on your server, we recommend backing it up, as it will be overwritten.

This walkthrough assumes you have some familiarity with Mist. If you need help at any point, consult the Mist User Guide.

The Sample Project

With the sample downloaded, extract its folder to your hard drive. Within the folder is the FederalReserveHandsOnLab.mst project file. Double click on the file to open the project in Mist. The Logical View for the project should look like this:

image

If you can't find the Logical View tool window, or any tool window that's referenced in this article, you can use the View ribbon button to open the tool window and bring it to the front.

image

Summary

To build and deploy the FederalReserve cube:

  1. Build & Run the Rebuild Warehouse Schema package.
  2. Build & Run the Rebuild Warehouse Data package.
  3. Build the FederalReserve cube.
  4. Deploy the FederalReserve cube.

Rebuild Warehouse Schema

To get started, you will need to create the FederalReserveInstruments database in SQL Server. Once it's created, the relational tables in the Federal Reserve project can be added to the database. Both of these steps are accomplished by the RebuildWarehouseSchema package. To open the package, double click on RebuildWarehouseSchema in the Logical View.

The RebuildWarehouseSchema package consists of seven ExecutePackage tasks.

image

The first task executes the CreateDatabase package, which runs SQL statements to create the database. You can view the ExecutePackage task's properties in the Package Details tool window.

image

The Package Details tool window is available when a package is open; otherwise, it's hidden. By default, it's located along the bottom of the Mist window. If you can't find it, you can use the View button to activate it.

Regarding the remaining tasks, if you select the DimMaturity task and view its properties in the Package Details tool window, you’ll see the following:

image

Notice that despite this being an Execute Package task, the DimMaturity table is selected. This is a unique Biml language capability: the Execute Package task can be used to deploy a table from your project to a SQL Server database. If your relational model changes, you can easily update the database table as part of your workflow. And since you’re selecting a logical asset, you’re not dependent on file paths, meaning you can move your project without needing to update the task.

To run the RebuildWarehouseSchema package and populate SQL Server, right click on the RebuildWarehouseSchema package and select Build & Run. This builds the package and its dependencies, then executes the package using DTEXEC.

image

Afterwards, you’ll see the added FederalReserveInstruments database in SSMS.

image

Rebuild Warehouse Data

The next step is to populate the tables with data. The RebuildWarehouseData package handles this by executing three child packages: LoadMetadata, LoadDimDate, and LoadData. To run the package, and populate the tables, right click on the RebuildWarehouseData package and select Build & Run. Note that this may take approximately 15 minutes to run.

Load Metadata

The LoadMetadata package populates the DimFrequency, DimInstrument, DimMaturity, and DimStatus tables in SQL Server.

The DimFrequency table contains values representing event frequencies, such as daily, weekly on every Wednesday, biweekly, and monthly.

image

The DimMaturity table holds maturity terms for loans and other financial instruments, as in a 30-year mortgage or a 7-year bond. The increments include 4-week, 1-month, 6-month, and 7-year, among others.

image

The DimInstrument table is the most interesting in terms of data processing. The LoadMetadata package starts by running a Flat File Source task. This task reads the metadata.txt file that’s included with the sample. metadata.txt is a comma-delimited file, where each row represents a location for Federal Reserve data, which we refer to as an instrument. Each row consists of:

  • a security type (bond, mortgage, etc.)
  • a URL where the data resides
  • a granularity (day, month, year, etc.)
  • a display name
  • values for the security’s frequency, maturity, and status

image

The file’s data is formatted using a Metadata file format asset that’s part of the project and specified in the Flat File connection. Once processed, three Lookup tasks run, which map the frequency, maturity, and status values in each instrument to ID columns. In SQL Server, these columns will reference the related tables using foreign keys.
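As a rough illustration of what the Flat File Source and Lookup tasks accomplish, here is a hedged Python sketch that parses a metadata-style row and swaps its frequency, maturity, and status text for dimension IDs. The field order, dimension values, and IDs here are assumptions for illustration, not taken from the sample's actual metadata.txt.

```python
# Illustrative sketch only: parse a comma-delimited metadata row, then
# replace the frequency, maturity, and status values with surrogate-key
# IDs, as the package's Lookup tasks do against DimFrequency,
# DimMaturity, and DimStatus. All names and values below are assumed.

import csv
from io import StringIO

# Hypothetical dimension lookups (in the package these are Lookup tasks).
frequency_ids = {"Daily": 1, "Weekly": 2, "Monthly": 3}
maturity_ids = {"4-week": 1, "1-month": 2, "6-month": 3, "7-year": 4}
status_ids = {"Active": 1, "Discontinued": 2}

def load_instruments(text):
    """Map each metadata row's text values to dimension IDs."""
    rows = []
    for row in csv.reader(StringIO(text)):
        security, url, granularity, name, freq, maturity, status = row
        rows.append({
            "SecurityType": security,
            "Url": url,
            "Granularity": granularity,
            "DisplayName": name,
            "FrequencyID": frequency_ids[freq],
            "MaturityID": maturity_ids[maturity],
            "StatusID": status_ids[status],
        })
    return rows

sample = "bond,http://example.com/rates.txt,day,7-Year Bond,Daily,7-year,Active"
print(load_instruments(sample)[0]["MaturityID"])  # → 4
```

In SQL Server, the resulting ID columns reference the dimension tables via foreign keys, which is exactly what the three Lookup tasks prepare.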

Finally, an OLE DB Destination task runs, populating the DimFrequency, DimInstrument, DimMaturity, and DimStatus tables in SQL Server.

Load DimDate

For this sample, the DimDate table needs to be populated with dates ranging from 1/1/1900 to 12/31/2015. Obviously, manually adding each row into the table is not an option; besides taking forever, it’d be a tedious process where mistakes could easily be made. Just imagine repeating this statement over 40,000 times:

                    
    INSERT INTO [DimDate]([DateID],[FullDate],[CalendarYear],
    [CalendarYearName],[CalendarSemester],[CalendarSemesterName],
    [CalendarQuarter],[CalendarQuarterName],[CalendarMonth],
    [CalendarMonthName],[CalendarWeek],[CalendarWeekName],
    [DayOfWeek],[IsWeekend],[IsHoliday],[CalendarDayOfMonth])
    VALUES(19000101,'1/1/1900',1900,N'1900',1,N'H1 1900',1,N'Q1 1900',1,
    N'CalendarMonthName',1,N'CalendarWeekName',2,0,0,1)
                    
                

If you asked a .NET software developer how to solve this, they would propose using a for loop to iterate over each date and add it into the dimension. While you could write a program to do that, there are disadvantages. First, you would need additional .NET knowledge to convert the SQL statements you’re already familiar with to .NET equivalents. Second, such a program would be completely detached from your cube’s relational model. Mist solves this by providing the best of both worlds via BimlScript.
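For comparison, the plain for-loop approach described above could be sketched in a few lines of Python; the column list is shortened and this is purely illustrative, not part of the sample.

```python
# Minimal sketch of the "write a program with a for loop" idea:
# iterate over each date and emit one INSERT statement per day.
# Only two columns are shown; the real dimension has many more.

from datetime import date, timedelta

def insert_statements(start, end):
    """Generate one DimDate INSERT statement per day in [start, end]."""
    statements = []
    d = start
    while d <= end:
        date_id = int("%d%02d%02d" % (d.year, d.month, d.day))
        statements.append(
            "INSERT INTO [DimDate]([DateID],[FullDate]) "
            "VALUES(%d,'%s')" % (date_id, d.isoformat()))
        d += timedelta(days=1)
    return statements

stmts = insert_statements(date(1900, 1, 1), date(1900, 1, 3))
print(len(stmts))  # → 3
```

The drawback, as noted above, is that a standalone program like this knows nothing about your cube’s relational model; BimlScript keeps the same loop inside the model itself.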

The LoadDimDate package contains two Execute SQL tasks that insert rows into the Date dimension, where each row is a date. The first task handles years 1900-1979 and the second handles 1980-2015. Each row’s INSERT INTO statement is created using BimlScript, which embeds .NET code into Biml. You can view the BimlScript by clicking on the LoadDimDate.biml document tab, which displays the LoadDimDate package's Biml.

image
                    
                        
                            
    <# var startDate = new DateTime(1900, 1, 1); #>
    <# var endDate = new DateTime(1979, 12, 31); #>
    <# for (var currentDate = startDate; currentDate <= endDate; currentDate = currentDate.AddDays(1)) { #>
        INSERT INTO [DimDate]([DateID],[FullDate],[CalendarYear],[CalendarYearName],[CalendarSemester],
        [CalendarSemesterName],[CalendarQuarter],[CalendarQuarterName],[CalendarMonth],[CalendarMonthName],
        [CalendarWeek],[CalendarWeekName],[DayOfWeek],[IsWeekend],[IsHoliday],[CalendarDayOfMonth])
        VALUES (
            <#=string.Format("{0}{1:0#}{2:0#}", currentDate.Year, currentDate.Month, currentDate.Day)#>,
            '<#=currentDate.ToShortDateString()#>',
            <#=currentDate.Year#>,
            N'<#=currentDate.Year.ToString()#>',
            <#=((currentDate.Month - 1) / 6) + 1#>,
            N'<#=String.Format("H{0} {1}", ((currentDate.Month - 1) / 6) + 1, currentDate.Year)#>',
            <#=((currentDate.Month - 1) / 3) + 1#>,
            N'<#=string.Format("Q{0} {1}", ((currentDate.Month - 1) / 3) + 1, currentDate.Year)#>',
            <#=currentDate.Month#>,
            N'CalendarMonthName',
            1,
            N'CalendarWeekName',
            <#=(int)currentDate.DayOfWeek + 1#>,
            <#=(int)currentDate.DayOfWeek == 0 || (int)currentDate.DayOfWeek == 6 ? 1 : 0#>,
            0,
            <#=currentDate.Day#>)
    <# } #>
                            
                        
                    
                

In the task, a for loop is used to iterate over each date in a range that starts with the startDate variable (set to 1/1/1900) and ends with the endDate variable (set to 12/31/1979). When the task is executed, the for loop runs, generating INSERT INTO statements for each date and then executing the statements. This becomes even more powerful when you change the startDate or endDate and simply rerun the package to update your database table. And with little .NET knowledge required, the cost of writing BimlScript is small.
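To make the loop's arithmetic easier to follow, the same per-date computations can be re-expressed as a small Python sketch. It mirrors the expressions in the BimlScript (converting Python's Monday-based weekday numbering to .NET's Sunday-based DayOfWeek) and is purely illustrative.

```python
# Python restatement of the BimlScript's per-date expressions.
# This is an illustration of the arithmetic, not part of the sample.

from datetime import date

def dim_date_row(d):
    """Compute the derived DimDate columns for a single date."""
    return {
        "DateID": int("%d%02d%02d" % (d.year, d.month, d.day)),
        "CalendarYear": d.year,
        "CalendarSemester": (d.month - 1) // 6 + 1,  # H1 or H2
        "CalendarQuarter": (d.month - 1) // 3 + 1,   # Q1..Q4
        "CalendarMonth": d.month,
        # .NET DayOfWeek runs 0 (Sunday) .. 6 (Saturday); Python's
        # weekday() runs 0 (Monday) .. 6 (Sunday), so convert, then add 1.
        "DayOfWeek": (d.weekday() + 1) % 7 + 1,
        "IsWeekend": 1 if d.weekday() >= 5 else 0,
        "CalendarDayOfMonth": d.day,
    }

row = dim_date_row(date(1900, 1, 1))  # a Monday
print(row["DateID"], row["CalendarSemester"], row["DayOfWeek"])  # → 19000101 1 2
```

Note how the computed values for 1/1/1900 match the sample INSERT statement shown earlier: DateID 19000101, semester 1, and DayOfWeek 2.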

Load Data

The LoadData package performs the final piece of initialization, filling in the FactRate table. This is accomplished by iterating over each row in DimInstrument using a Foreach ADO loop. Within the loop, a Script task downloads Federal Reserve data from each instrument’s URL.

If you select the Script task and open the Package Details tool window, you’ll see that like BIDS, the Script task references a Script Project. However, unlike BIDS, if you double click on the Script Project in the Logical View, you’ll see that the script project opens within Mist, and offers syntax highlighting, quick info popups, and Intelliprompt. Mist’s integrated Script Project editor lets you add variables, resources, settings, references, and additional files; it provides convenience with no loss of functionality.

image

After a file is downloaded, the data in the file is processed using a series of transforms. The final result looks like:

image
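As a rough sketch of the kind of transform involved, the following Python snippet parses date/value lines from a downloaded rate file and drops missing observations. The file layout and the "ND" missing-value marker are assumptions for illustration, not details taken from the sample's actual source files.

```python
# Hedged sketch of a per-file transform: parse "date,value" lines and
# skip rows with missing observations. The column layout and the "ND"
# marker are assumed, not taken from the sample.

from datetime import datetime

def parse_rates(text):
    """Yield (date, rate) pairs from a simple date,value rate file."""
    rows = []
    for line in text.strip().splitlines():
        date_text, value = line.split(",")
        if value in ("ND", ""):  # skip missing observations
            continue
        rows.append((datetime.strptime(date_text, "%Y-%m-%d").date(),
                     float(value)))
    return rows

sample = "1962-01-02,4.06\n1962-01-03,ND\n1962-01-04,3.99"
print(parse_rates(sample))
```

In the package itself, this work is done by the Script task and the transforms that follow it, with the cleaned rows landing in FactRate.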

FederalReserve Cube

With your relational model and data deployed to SQL Server, you can now build the entire project to create your cube. Select the ribbon's Build & Deploy tab and press the Build button. This builds the entire project, placing the emitted assets in the project’s output directory.

To deploy the cube, you will need to use BIDS. However, as all the emitted assets are compatible with Microsoft BI tools, it’ll be easy. Navigate to the project’s output\SSAS directory and double click on Analysis.dwproj to open the project in BIDS. From BIDS, you can run the Deploy Analysis command in the Build menu to deploy your cube.

To confirm your cube works, open Excel. In the Data tab, open the From Other Sources dropdown button and select From Analysis Services.

image

Enter your server name in the dialog’s Server Name field, and select the Analysis database. You should see the FederalReserve cube. You can now finish the dialog to import the data into Excel and build a report.

Conclusion

Mist and Biml provide powerful tools to easily and quickly design and populate your relational model. Improving upon BIDS, Mist makes cube preparation, design, and development much faster and less error prone, enabling you to be efficient and successful.