Beer, Cheese and SQL Saturday 118 Madison

This past weekend I (w|t) headed up to Madison, WI to attend and present at SQL Saturday 118. The MADPASS team did a fantastic job on their first SQL Saturday.

Attendance was excellent, with 227 attendees consisting of locals and groups from Minnesota, Iowa, Missouri, Indiana, Illinois, Michigan, Nebraska and other far-flung places.

I drove up early Friday and worked remotely for the day to make sure that I didn’t miss the speaker dinner.

The fun started early as I met Jes Borland (b|t), Erin Stellato (b|t) and Joey D’Antoni (b|t) for lunch. Jes and Erin arrived “fresh” from a 15-mile run with hearty appetites. Cheeseburgers, grilled cheese and a beef melt were the fare. No beer at lunch; I had more work to do.

After work and settling into my accommodations, I headed over to the venue to give a hand with bag stuffing. Wow, the MATC was bigger than it looked. By the time I got to the room where bag stuffing was taking place, all the work was done. I developed quite a thirst trying to find that room. Fortunately we headed over to the Ale Asylum for a pre-speaker dinner beverage. I had a Big Slick Stout along with some good conversation. That stout was awesome. I would’ve liked to try some of Ale Asylum’s other beers, but we had to move on.

The speaker dinner was at a local Italian place. More good conversation and another excellent stout. I chose the Lilja’s Sasquatch Stout from Sand Creek Brewing Company, a big and powerful beer. It went very well with the food (cheese) and the fantastic chocolate that Bill Fellows (b|t) shared with us. The highlight of the dinner was the impromptu paper airplane fight.

After the speaker dinner, a group of us took over the hotel bar. Geek rule. More good conversation and Capitol Brewery’s Maibock.

Saturday morning came quickly and since I was on at 9:30, I headed over to MATC early to grab a bite and park in the room where my session was scheduled. Mike Donnelly (b|t) was there giving his first SQL Saturday presentation ever, SSIS: Figuring Out Configuring. I listened as I was configuring the VM for my presentation. He did a great job.

My presentation, SSIS 2012 New Features, was next. Attendance was very good. Thank you to all who attended. Audience participation was also good with lots of questions as well as valuable additions from fellow speakers Bill Fellows, Mark Vaillancourt (b|t) and Aaron Lowe (b|t).

With my speaker duties fulfilled, I headed over to Stacia Misner’s (b|t) presentation, (Way Too Much) Fun with Reporting Services. Stacia’s daughter Erika Bakse (b|t) was also assisting with the presentation. The banter between these two made for a very entertaining session. They covered an implementation of Words with Friends in SSRS. Now let’s see Connect 4 implemented with a play-the-computer mode using set-based algorithms.

After lunch I attended Eddie Wuerch’s (b|t) session, TempDB Performance Troubleshooting and Optimizing. I’ve talked with Eddie at the previous Chicago and Indianapolis SQL Saturdays, but I had never attended one of his sessions. This was an excellent session with lots of myth busting and fairly deep insights into tempdb behavior. He did a great job explaining it all in a simple manner.

I then attended fellow Chicagoan Luke Jian’s (b|t) session, Optimizing SQL Server I/O with Solid State Drives. Lots of good information on SSD hardware. I was surprised by the light attendance in this session. Are SSDs not a hot topic? You missed a good one here. You can catch Luke and this session at SQL Saturday 119 in Chicago.

The last session I attended was Zach Mattson’s (b|t), A PowerShell Driven Life. Zach mixed up PowerShell script walkthroughs with reviews of a pair of beers, one bottled (good) and one canned (probably bad). I picked up a few new PowerShell tricks and a few recommendations on beers to avoid. Good info and good fun!

A great day and I am definitely looking forward to the next SQL Saturday in Madison.

Congratulations to MADPASS for running a stellar event.

SQL Saturday 98, Iowa City

Calling all SQL peeps in Chicago, Milwaukee, St. Louis, Minneapolis, Omaha,…!

It’s time for another SQL Saturday at the beautiful University of Iowa campus.

The speaker lineup looks great.

I’ll be there presenting on the SSIS Script Task.

There will be more SSIS goodness presented by Ted Krueger and Ross McNeely.

There’s still time to register and take a nice fall SQL road trip this Saturday.

See you there.

PowerShell, the New SQL Hammer

This is my PowerShell presentation that I have given to both Chicago SQL Server User Groups. The name of the presentation has some fun with the old quote, “If you have a hammer, everything is a nail”, which allows me to use a real hammer in the presentation to good effect. I have received positive feedback from the people who attended, and someone even stopped me in the hall at SQL Saturday Chicago to say that it helped them use PowerShell to be the hero. It is a powerful hammer indeed.

I attended various PowerShell presentations and I just was not “getting it”. The presentation chronicles my journey trying to understand how all the bits and pieces of PowerShell work. It shows how to dig into things a bit deeper to help use PowerShell and the SQL Server integration provided with SQL Server 2008 more effectively.

I will be giving the presentation at other user groups in the Midwest and I have submitted it to nearby SQL Saturdays.

The slides and scripts are available here for download.

Thanks to Aaron Nelson (Blog|Twitter) and Nic Cain (Blog|Twitter) for their help with script reviews, feedback and inspiration.

SQL Saturday 67 Chicago Pre-con Double Down

Yes, this looks familiar. Didn’t I write this last week? Well it’s new and the same. We have another pre-con available on the Friday, 3/25, before SQL Saturday 67.

Here is the link to register, http://sqlsat67troubleshootingprecon-estw.eventbrite.com/

The presenter is Kevin Kline: MVP, SQL Server expert, author, keynote presenter, SQL cruiser, former PASS President, etc.

This presentation will be held at the Quest office in Warrenville, IL.

Here is the detailed information on the pre-con session:

Cost: $99 (until March 11th), $119 thereafter

Location: Quest, 4320 Winfield Rd #500, Warrenville, IL 60555-4036

Troubleshooting & Performance Tuning Microsoft SQL Server 2008 R2

 

Presenter: Author, expert, and Microsoft MVP Kevin Kline (http://KevinEKline.com)

 

This full-day seminar will teach you introductory, intermediate, and advanced techniques for troubleshooting and performance tuning using SQL Server’s built-in Dynamic Management Views (DMVs), Performance Monitor, Profiler, wait stats, and Extended Events. In troubleshooting, you’ll learn how to detect, diagnose, and resolve problems on SQL Server using the native tools, as well as how to determine where SQL Server is experiencing bottlenecks that prevent it from achieving optimal performance. Bring your laptop with SQL Server 2008 R2 and AdventureWorks R2 to work through the labs as Kevin demonstrates various techniques. (SQL Server 2005 is also acceptable, but you will not be able to work through the labs that are specific to SQL Server 2008 R2.)
Topics covered include:
  • The “Rocks, Gravel, Sand” Troubleshooting Methodology
    • Understanding Wait Stats
    • Using beginner, intermediate, and advanced DMVs
    • Correlating Performance Monitor and Profiler
    • Extended Events
  • Determining meaningful key performance indicators through benchmarking
  • Catching problems before your users do through monitoring
  • Preventative Maintenance Best Practices
Seminar includes lunch and refreshments.

Attendees will receive troubleshooting scripts, posters detailing important DMVs and PerfMon counters, session video recordings on DVD, and a free signed copy of Kevin’s book Database Benchmarking.

SQL Saturday #67, filled with many free sessions and events, will be held the next day. For more info and to register for SQL Saturday, please visit the event website: http://www.sqlsaturday.com/67/eventhome.aspx

SQL Saturday 67 Chicago Pre-con

We have big news for SQL Saturday #67 Chicago. There is going to be a pre-con on Friday 3/25 at the Hampton Inn right around the corner from the SQL Saturday venue.

Here is the link to register, SQL Saturday #67 Pre-Conference: DBA Skills Upgrade Toolkit.

Questioning why you should attend the pre-con? Check out Noel McKinney’s take on it.

Before I give details about the event, let me introduce you to the presenter. The pre-con will be presented by Eddie Wuerch (Blog|Twitter) from Indianapolis. I had the pleasure of meeting Eddie at SQL Saturday 45 in Louisville a few weeks ago. We had some interesting discussions about the extreme loads that hit his environment. With this daily experience, I’m sure Eddie will have some great things to share in the pre-con session. Eddie is also a regular presenter at SQL Saturdays and Indy PASS. Eddie tweeted this earlier. Need I say more: “@dawndevirgilio: A million rows of data isn’t cool, you know what’s cool? A billion…! #ETHub” <-we’re approaching a trillion, actually…

Here is the detailed information on the pre-con session:

The DBA Skills Upgrade Toolkit

Cost: $99 (before March 7)

Presenter: Eddie Wuerch

This full-day seminar is for DBAs and developers with DBA responsibilities looking to beef up their skillsets and simplify their lives with additional insight into their SQL Servers.  Eddie Wuerch takes his extensive experience as a speaker, trainer, mentor, and DBA in one of the largest and busiest SQL Server environments in the world and distills it into a collection of topics for DBAs managing databases of all sizes.

Topics covered in the seminar include:

  • Current and historical monitoring of server processes and resources
  • SQL I/O for the DBA:
    • RAID, disk options, and SANs
    • File and Filegroup planning, placement, and monitoring
    • Transaction log internals
    • TempDB considerations
  • Encryption for the DBA
  • Locking, Latching, and Deadlocks
  • Capacity planning (CPU, memory, and files)
  • Performance and operational troubleshooting techniques
  • Backup and recovery planning
  • Automation of standard tasks
  • Includes a large collection of T-SQL scripts for performing many of the tasks covered in the seminar

Seminar fee includes lunch, breaktime refreshments, and seminar materials.

SQL Saturday #67, filled with many free sessions and events, will be held the next day. For more info and to register for SQL Saturday, please visit the event website: http://www.sqlsaturday.com/67/eventhome.aspx

Getting Drive Info, Part 5, the SSIS WMI Data Reader Task

In the final installment of the Getting Drive Info series (Part 1, Part 2, Part 3, Part 4), SSIS will again be used to collect and save the drive information. This time, the WMI Data Reader Task will be used to collect the drive information.

WMI, Windows Management Instrumentation, is the Microsoft implementation of the industry-standard Web-Based Enterprise Management, WBEM, using the Common Information Model, CIM. WMI provides easy access to information about the computer hardware and software via classes like Win32_LogicalDisk. The classes can be queried with a SQL-like language called WQL (SQL for WMI).
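For example, a minimal WQL query against the Win32_LogicalDisk class looks like this (the columns are standard properties of the class):

```sql
-- Return every logical disk along with its type, total size and free space
SELECT DeviceId, DriveType, Size, FreeSpace
FROM Win32_LogicalDisk
```

We will narrow this query down below once the connection is in place.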

WMI Data Reader Task

The package uses the WMI Data Reader Task to gather the disk information. It requires a WMI connection and a WQL query to define what information to gather and from where.

Let’s start with the WMI connection configuration. Right click in the BIDS Connection Managers pane and click on New Connection… which will pop up the Add SSIS Connection Manager dialog.

Select WMI and click the Add… button. The WMI Connection Manager Editor will be displayed.

Configuring the WMI connection for the local computer is fairly simple. Set it up as shown above with the name of your choice of course and click OK. This configuration will only work if the account that the package will run under has rights to read data from the WMI provider. If there are security issues reading WMI data, other credentials will need to be provided or the account that runs the package given the rights to query the WMI provider.

In the case where the server being queried for WMI data is not the computer the package is running on, credentials will need to be supplied. If the account that is running the package has access on the other computer, then checking Use Windows Authentication should work. When the other computer is in a different domain or the account running the package does not have access, then a Windows user name and password will need to be supplied. This can be a domain account or a local Windows account on the other computer. Always supply the user name with a domain or the other computer name. In highly secure environments, remote WMI security access may need to be configured on the other computer. WMI security configuration will not be covered here.

The Namespace property can usually be set to \root\cimv2. There will be cases when the WMI class to be queried is under a different WMI namespace. These other namespaces can be browsed with WMI CIM Studio or Scriptomatic 2.0. I also recommend these tools to browse the classes available and see the properties and values. A sample of the namespaces available is shown below.

Now that the connection is configured, work can begin on the WMI Data Reader Task.

Create a WMI Data Reader Task on the Control Flow designer pane.

Open the WMI Data Reader Editor. The SQL Server 2005 version is displayed. The SQL Server 2008 version has an additional page called General where the Name and Description of the task can be set.

Let’s set up the WMI Options.

The WmiConnection has been set to the one configured above.

WqlQuerySourceType is set to Direct Input, since the query will be statically defined in the WqlQuerySource property. WqlQuerySourceType can also be set to File connection or Variable. Use File connection if the query is defined in a file. Variable would be used if the query needs to be dynamic. A dynamic query is usually required if the WMI Data Reader Task is placed in a looping container or the query is dependent on the results of a previous task in the package.
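As a sketch of the Variable option, the query could be assembled with an SSIS expression on a String variable; the User::DeviceId variable below is hypothetical:

```
"SELECT DeviceId, Size, FreeSpace FROM Win32_LogicalDisk WHERE DeviceId = '" + @[User::DeviceId] + "'"
```

Set EvaluateAsExpression on the variable to True so the query is rebuilt each time the variable is read, for example on each pass through a loop.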

WqlQuerySource needs to be set to a valid WQL query. In order to get the drive information, there are two different WMI classes that could be queried. The first, Win32_LogicalDisk, only seems to handle locally attached storage. The query to get the drive information is:

SELECT DeviceId, Size, FreeSpace FROM Win32_LogicalDisk WHERE DriveType = 3

The other class to try is Win32_Volume. This class would probably need to be used to get the information for mount points. (Thanks Dave Levy!) Here is the query:

SELECT DriveLetter, Capacity, FreeSpace FROM Win32_Volume WHERE DriveType = 3

DriveType 3 limits the query to fixed drives. DriveType 2, removable drives, may need to be included. Add OR DriveType = 2 to the queries if all the drives expected are not being returned.
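For example, the Win32_LogicalDisk query with removable drives included becomes:

```sql
SELECT DeviceId, Size, FreeSpace
FROM Win32_LogicalDisk
WHERE DriveType = 3 OR DriveType = 2
```

The same OR clause works for the Win32_Volume query.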

OutputType should be set to Data table so that the results from multiple drives can be returned. The data will end up in a variable that we will define later. The other values for OutputType are Property name and value and Property value.

OverwriteDestination should be set to Overwrite destination. The other values are Keep original and Append to destination. The SQL Server 2008 editor disables this property, which really makes more sense when writing to a variable.

DestinationType should be set to Variable. The other value is File connection. Use File connection when the data needs to be written to a file. Changing this property changes the options in the next property Destination.

Destination should be set to a variable of type Object. The variable DriveInfo can be created from here or beforehand from the Variables browser. Remember to watch the variable scoping. It should be scoped at the package level.

That completes the configuration of the WMI Data Reader Task.

Data Flow Tasks

The contents of the DriveInfo variable need to be processed with a Data Flow Task. This will be accomplished with a Script Component used as a data source feeding an OLE DB destination that will write the drive information to a table.

Create a Data Flow Task on the Control Flow designer pane and connect the WMI Data Reader Task to it.

The Script Component

Switch over to the Data Flow designer pane and drag a Script Component object from the Data Flow Transformations section of the Toolbox. The Select Script Component Type dialog will appear. The Script Component can be a source of data, a transformer of data or a consumer of data. Choose Source, as this Script Component is going to provide drive information in a format that an OLE DB destination can consume, and click OK. Rename the Script Component Get DriveInfo. All of the images below came from BIDS 2005. BIDS 2008 is slightly different.

Double click on the Script Component to open the editor. In SQL Server 2008 the Script editor is the default. Click on Inputs and Outputs first to display that editor. Now let’s add some output columns. Rename Output 0 to DriveInfo and add the columns:

  • Drive, data type string [DT_STR]
  • DriveSize, data type float [DT_R4]
  • FreeSpace, data type float [DT_R4]

Here are the column property details for the Drive column.

It is important to define the output columns first so they show up in the OutputBuffer object while coding the script. The output buffer will be named DriveInfoBuffer in this case.

Switch the Script Transformation Editor to Script mode by clicking on Script in the left pane.

The ReadOnlyVariables property in the Custom Properties section needs to be set to User::DriveInfo so that the Data Table filled by the WMI Data Reader Task can be processed by the Script Component.

Now it’s time to add the code to get the drive information and send it to the output buffer. Click the Design Script… button and add the code below. The sample code is for the Win32_LogicalDisk query. If Win32_Volume is being used, change the column names to match that WMI query. Note that the User::DriveInfo variable has to be converted to a DataTable type. Here is the code in VB.Net for SSIS 2005.

Imports System
Imports System.Data
Imports System.Math
Imports Microsoft.SqlServer.Dts.Pipeline.Wrapper
Imports Microsoft.SqlServer.Dts.Runtime.Wrapper

Public Class ScriptMain
    Inherits UserComponent

    Public Overrides Sub CreateNewOutputRows()
        Dim dr As DataRow
        Dim dt As DataTable

        dt = CType(Variables.DriveInfo, DataTable)
        For Each dr In dt.Rows
            DriveInfoBuffer.AddRow()
            DriveInfoBuffer.Drive = dr("DeviceId").ToString()
            DriveInfoBuffer.DriveSize = CType(dr("Size"), Single)
            DriveInfoBuffer.FreeSpace = CType(dr("FreeSpace"), Single)
        Next
    End Sub

End Class

Here is the code in C# for SSIS 2008.

Make sure to set the ScriptLanguage property to Microsoft Visual C# 2008 in BIDS 2008 before clicking the Edit Script… button and pasting in the code below.

using System;
using System.Data;
using System.IO;
using Microsoft.SqlServer.Dts.Pipeline.Wrapper;
using Microsoft.SqlServer.Dts.Runtime.Wrapper;

[Microsoft.SqlServer.Dts.Pipeline.SSISScriptComponentEntryPointAttribute]
public class ScriptMain : UserComponent
{
    public override void PreExecute()
    {
        base.PreExecute();
    }

    public override void PostExecute()
    {
        base.PostExecute();
    }

    public override void CreateNewOutputRows()
    {
        DataTable di = (DataTable)Variables.DriveInfo;

        foreach (DataRow dr in di.Rows)
        {
            DriveInfoBuffer.AddRow();
            DriveInfoBuffer.Drive = dr["DeviceId"].ToString();
            DriveInfoBuffer.DriveSize = Convert.ToSingle(dr["Size"]);
            DriveInfoBuffer.FreeSpace = Convert.ToSingle(dr["FreeSpace"]);
        }
    }

}

Now we need to add the destination to store the drive information. Grab the OLE DB Destination from the Data Flow Destinations of the Toolbox and drag it onto the Data Flow designer and name it Put DriveInfo. Connect the output of the Get DriveInfo Script Component to Put DriveInfo.

 

In the OLE DB Destination Editor, set the OLE DB connection manager to the DBA_Tools connection created earlier and pick the DriveInfo table from the table or view drop down.

Here is the create table statement for the DriveInfo table used in the earlier parts of the series.

CREATE TABLE dbo.DriveInfo
( Drive VARCHAR(255)
, DriveSize FLOAT
, FreeSpace FLOAT)

Run the package and all the drive information for the computer should be stored in the DriveInfo table.
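To spot-check the results, a quick query against the table helps; note that WMI reports Size and FreeSpace in bytes, so a conversion to gigabytes makes the numbers readable:

```sql
-- Sizes are stored as raw byte counts; divide by 1024^3 for GB
SELECT Drive,
       DriveSize / 1073741824.0 AS DriveSizeGB,
       FreeSpace / 1073741824.0 AS FreeSpaceGB
FROM dbo.DriveInfo
```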

That concludes the Drive Information series. This is a great pattern to follow to get any information via WMI and store it in a central database for server reporting. Have fun experimenting.

Un-SQL Friday 001 : Branding for Business

I really like Jen’s (MidnightDBA, #sqlawesomesauce) idea for the first Un-SQL Friday. I was struggling with the topic and had to read some of the posts from earlier today. No one had written from a business perspective yet, so I decided to write about my journey and branding for my company, Gerasus Software. I own and operate a single-person ISV, also known as a microISV. Just because most of the branding effort is geared toward building the business does not mean I have no interest in helping the SQL Server community.

Let’s go back to 1998. At that time I was a DBA in Clearwater, FL. A group of us decided to start a user group. The Tampa Bay SQL Server User Group was born. Buck Woody (Blog|Twitter) and I worked together at the time and he was involved. I presented quite a bit back then. Information was scarce. Support from Microsoft was limited. It was hard work. I was active in the replication newsgroups. That was defeating because someone from the UK would answer all the questions before I woke up. I also questioned why I was doing this. I moved on to another position and disappeared from the group. Buck took over as President of the group and you know the rest of that story.

Fast forward to 2003. I had moved back home to the Chicago area and was working on a large server migration. There were several hundred DTS packages that needed to be examined to determine how to work around connection objects with IP addresses and names that pointed to the old servers. I needed a tool that would search DTS packages. I started throwing together some VBScript to do the searching. It worked well enough. I looked for a commercial product, but found nothing. I had also been reading how Redgate got started and was inspired. Since there was nothing on the market to search DTS packages, I decided I would write a utility and sell it. Gerasus Software and DTS Package Search were born. I have a strong development background and really enjoy coding UIs. It gives that visual gratification that writing SQL is missing. I finished and launched version 1.0 of the program in 2006. After adding improvements, I went about adding SSIS support. Version 2.0, SSIS-DTS Package Search, was released in 2009. Now I could come up for air and start focusing a little more on branding.

Seth Godin says you should tell a story, and here is the story about the name Gerasus. It is a made-up word. It’s difficult to come up with a company name that is unique. This one is personal and says something about me. My parents are German; there is the “Ger”. I was born in Australia, which originally added the “aus”, and I grew up and live in the US. So we had Gerausus, which was a bit clunky. It was shortened to Gerasus. People do have trouble pronouncing it. The “a” is short, as in sat. Not so much of a branding win.

Since there was no other product like it on the market at the time, I could name it with the words people would search for on Google. Branding win.

The support from Microsoft is huge now. Visit BizSpark and see. I am fortunate to be enrolled in that program and get those benefits. Plus I get to play with all the latest stuff.

I discovered Twitter and have a personal and business account. I really enjoy answering questions on the #sqlhelp tag. I do some marketing on Twitter, but mostly try to help and have fun. I started to blog and keep the posts technical and helpful, not just about the product. I started speaking again with a presentation on SSIS, which is definitely about branding. I am also putting one together on PowerShell, because I find it interesting and it will help me to learn more about it.

I went to my first SQL Saturday in Chicago this year and gave away copies of SSIS-DTS Package Search. The organizers Wendy Pasterick (Blog|Twitter), Aaron Lowe (Blog|Twitter) and Ted Krueger (Blog|Twitter) were very supportive. I was also fortunate enough to attend SQL Saturday 40. I’ve donated copies of SSIS-DTS Package Search to every SQL Saturday since midsummer. It’s been great working with the organizers. I’m glad that I can support SQL Saturdays with prizes and help make them better and bigger. I will be much more involved in the next Chicago SQL Saturday.

Being the owner of a company has forced me to learn many new skills, and one of those is branding. It’s important. It’s been a great year for the company and I look forward to continuing to help SQL Saturdays and the SQL community at large.

So now that I’ve done all this brand building for me, my company, Jen and Sean, I have to ask a question. What is the pain point in your work life? Does someone make a solution for it? No? Then start coding, start building your brand and get that product out there. The community is here to help you and Microsoft is here to help you. Hope to catch you in the DBSsAtMidnight UStream chat room.

Troubleshooting Connectivity with UDL files

I had a new SSIS package that was ready to deploy to production. It was pulling data from Oracle, and this was the first package being placed into production that used Oracle as a source. Arrangements had been made to have the Oracle OLE DB drivers installed on the production SQL Server, as the performance was determined to be better than the OLE DB driver Microsoft provided at the time. On the first run of the package in production, the Oracle connection failed validation with a very useless error message: Connection failed validation. Why? The production server did not have any client tools to test connectivity, as the system administrator had merely installed the drivers. As this was a financial institution and the server was locked down, the SQL Server service account was set up with minimal rights.

I sat down with the production DBA to troubleshoot the issue. Fortunately, we were able to log in interactively using the SQL Server service account. We created a UDL file to test whether a connection could be made to the production Oracle server. The first thing we checked was the Provider tab, which verified that the Oracle OLE DB driver was indeed installed. We then attempted to create a connection and an error was raised. The error showed that the SQL Server service account did not have rights to the directory where the Oracle driver was installed and could not read the tnsnames.ora file. We contacted the system administrator to grant the directory rights to the SQL Server service account, and the connection was made on the next test. The SSIS package ran successfully as well.

Let’s take a closer look at UDL files.

UDL files are useful to:

  • troubleshoot connectivity issues.
  • prove that a computer can connect to a database server.
  • determine if the OLE DB driver you need is installed on the machine.
  • generate a connection string.

 

A UDL file can be created two different ways. If you have the “Hide extensions for known file types” option turned off in Explorer, the UDL file can be created directly from the Desktop. Right-click on the Desktop and select New->Text Document. Rename the new file test.udl. Windows will pop up the warning below. Click Yes.

If you haven’t seen it before, here is the “Hide extensions for known file types” option.

The second way is to open Notepad and save an empty file to the Desktop named test.udl. The file icon should look like this on new versions of Windows.

Here is the older version of the icon that would appear on Windows XP/2003.

Double click the newly created icon and the Data Link Properties dialog will be displayed.

The Connection tab is ready to create a connection to a SQL Server by default. In order to create a connection using a different driver, click on the Provider tab and select the driver.

Now when the Connection tab is selected, notice how the UI has changed to provide the appropriate options for the new driver.

The Advanced tab allows additional connection properties to be set.

The All tab shows all of the properties that can be set for the chosen provider. There are properties on this tab that may not be in the documentation provided by the vendor.

Now that we have seen all this fancy UI magic, the question is “What is a UDL file?” It is just a text file with a connection string. Windows launches the UI based on the file extension.

Let’s create a UDL using the SQL Server OLE DB driver with Windows authentication, pointing to the AdventureWorks database and with an application name of “Test UDL”. Here are the results.

; Everything after this line is an OLE DB initstring

Provider=SQLOLEDB.1;Integrated Security=SSPI;Persist Security Info=False;Initial Catalog=AdventureWorks;Data Source=.\SQL2005;Application Name=Test UDL
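For comparison, a UDL pointed at the Oracle OLE DB provider, like the one from the troubleshooting story above, produces a string along these lines. The Data Source here is a hypothetical tnsnames.ora alias and the credentials are placeholders:

```
; Everything after this line is an OLE DB initstring
Provider=OraOLEDB.Oracle.1;User ID=scott;Password=tiger;Data Source=PRODDB;Persist Security Info=True
```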

The UDL file has been around for a while, is available on any Windows box, and is a quick, lightweight way to troubleshoot your connections.

Getting Drive Info, Part 4, the SSIS Script Component

In this installment of the Getting Drive Info series (Part 1, Part 2, Part 3), SSIS will be used to collect and save the drive information. SSIS provides multiple ways to accomplish this task. This installment will focus on using the Script Component to collect the drive information.

The Script component will be used as the source of a Data Flow where the DriveInfo class of the System.IO namespace will be used to collect the drive information. The DBA_Tools database and DriveInfo table from the earlier installments will also be needed.

Fire up BIDS and create a new project. Name the package something appropriate.

Drag a Data Flow task onto the Control Flow designer and name it DriveInfo.

Create a new Connection Manager that points to the DBA_Tools database on your SQL Server.

Bring the Data Flow designer into focus. Grab the Script Component from the Data Flow Transformations of the Toolbox and drag it onto the Data Flow designer. The Select Script Component Type dialog will appear. The Script Component can be a source of data, a transformer of data or a consumer of data. Choose Source, since this Script Component is going to provide drive information and click OK. Rename the Script Component Get DriveInfo.

Double click on the Script Component to open the editor. Now let’s add some output columns. Rename Output 0 to DriveInfo and add the columns:

  • Drive, data type string [DT_STR]
  • DriveSize, data type float [DT_R4]
  • FreeSpace, data type float [DT_R4]

Here are the column property details for the Drive column.

It is important to define the output columns first so they show up in the OutputBuffer object while coding the script. The output buffer will be named DriveInfoBuffer in this case.

Now it’s time to add the code to get the drive information and send it to the output buffer. Here is the code in VB.Net for SSIS 2005.

Imports System
Imports System.Data
Imports System.IO
Imports System.Math
Imports Microsoft.SqlServer.Dts.Pipeline.Wrapper
Imports Microsoft.SqlServer.Dts.Runtime.Wrapper

Public Class ScriptMain
    Inherits UserComponent

    Public Overrides Sub CreateNewOutputRows()

        ' Walk every drive on the machine, keeping only fixed drives
        ' that are ready (skips optical drives with no media, etc.).
        For Each di As DriveInfo In DriveInfo.GetDrives()
            If di.IsReady AndAlso di.DriveType = DriveType.Fixed Then
                ' Add a row to the DriveInfo output for each drive.
                Me.DriveInfoBuffer.AddRow()
                Me.DriveInfoBuffer.Drive = di.Name
                Me.DriveInfoBuffer.DriveSize = di.TotalSize
                Me.DriveInfoBuffer.FreeSpace = di.TotalFreeSpace
            End If
        Next

    End Sub

End Class

Here is the code in C# for SSIS 2008.

Make sure to set the ScriptLanguage property to Microsoft Visual C# 2008 in BIDS 2008 before clicking the Edit Script… button and pasting in the code below.

using System;
using System.Data;
using System.IO;
using Microsoft.SqlServer.Dts.Pipeline.Wrapper;
using Microsoft.SqlServer.Dts.Runtime.Wrapper;

[Microsoft.SqlServer.Dts.Pipeline.SSISScriptComponentEntryPointAttribute]
public class ScriptMain : UserComponent
{
    public override void CreateNewOutputRows()
    {
        // Walk every drive on the machine, keeping only fixed drives
        // that are ready (skips optical drives with no media, etc.).
        foreach (DriveInfo di in DriveInfo.GetDrives())
        {
            if (di.IsReady && di.DriveType == DriveType.Fixed)
            {
                // Add a row to the DriveInfo output for each drive.
                DriveInfoBuffer.AddRow();
                DriveInfoBuffer.Drive = di.Name;
                DriveInfoBuffer.DriveSize = di.TotalSize;
                DriveInfoBuffer.FreeSpace = di.TotalFreeSpace;
            }
        }
    }
}

Now we need to add a destination to store the drive information. Grab the OLE DB Destination from the Data Flow Destinations section of the Toolbox, drag it onto the Data Flow designer and name it Put DriveInfo. Connect the output of the Get DriveInfo Script Component to Put DriveInfo.

In the OLE DB Destination Editor, set the OLE DB connection manager to the DBA_Tools connection created earlier and pick the DriveInfo table from the table or view drop down.

Here is the CREATE TABLE statement for the DriveInfo table used in the earlier parts of the series.

CREATE TABLE dbo.DriveInfo
( Drive VARCHAR(255)
, DriveSize FLOAT
, FreeSpace FLOAT)
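
Once the package is collecting data on a schedule, a quick sanity check of the stored values can be handy. Here is a hypothetical query sketch against the table above (the FreePercent alias is my own, not part of the series):

-- Sketch: free space as a percentage of capacity, lowest first.
SELECT Drive
     , DriveSize
     , FreeSpace
     , FreeSpace / DriveSize * 100 AS FreePercent
  FROM dbo.DriveInfo
 ORDER BY FreePercent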

Now you are ready to start collecting and saving the drive information on your SQL Server. I will leave it to you to explore the DriveInfo object to see what other bits of information may be useful to collect and store for the drives. The only limitation of this technique is that the package can only get drive information for the server it is running on. The package would need to be deployed and run on every SQL Server where collecting the drive information is desired. The last part of this series will address this limitation by implementing the collection of drive information using the WMI Task in SSIS.

Getting Drive Info Part 3, CLR

In the first two parts (Part 1, Part 2) of the Getting Drive Info series, techniques to gather drive info that will work on SQL Server 2000 were presented. Now it is time to look at the options available to the newer versions of SQL Server. In this article the drive info will be retrieved via the CLR. Andy Novick (Blog) wrote the code to accomplish this in his article, CLR Function to return free space for all drives on a server, on MSSQLTips.com. Security and environment setup were not covered in that article. Rather than reinvent the wheel, the code from the referenced article will be used here, and I will discuss security and the setup I created to run it.

The first attempt was to build and deploy to the master database. This resulted in the following error:

Msg 6522, Level 16, State 2, Line 1
A .NET Framework error occurred during execution of user-defined routine or aggregate “drive_info”:
System.Security.SecurityException: Request for the permission of type ‘System.Security.Permissions.SecurityPermission, mscorlib, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089’ failed.
System.Security.SecurityException:
at System.Security.CodeAccessSecurityEngine.Check(Object demand, StackCrawlMark& stackMark, Boolean isPermSet)
at System.Security.CodeAccessPermission.Demand()
at System.IO.Directory.GetLogicalDrives()
at System.IO.DriveInfo.GetDrives()
at UserDefinedFunctions.drive_info()

After getting the above error and not really understanding the ramifications of an unsafe assembly, I decided it was time to do some reading. I found that the book A Developer's Guide to SQL Server 2005 had an excellent discussion of the CLR and the different assembly security types. Now I was ready to create an environment for this function.

Since this was a function for DBAs to use for system monitoring, I created a new database named DBA_Tools and set the owner to sa. In order to let an Unsafe assembly execute, the TRUSTWORTHY option needs to be enabled. Turn on Trustworthy with this command.

ALTER DATABASE DBA_Tools SET TRUSTWORTHY ON
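
To verify the setting took effect, the flag can be checked in the sys.databases catalog view (a quick sketch):

SELECT name, is_trustworthy_on
  FROM sys.databases
 WHERE name = 'DBA_Tools'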

Make sure that the CLR server option is enabled, otherwise you can’t execute the CLR function. This can be accomplished with this script.

EXECUTE sp_configure 'show advanced options', 1
RECONFIGURE WITH OVERRIDE

EXECUTE sp_configure 'clr enabled', 1
RECONFIGURE WITH OVERRIDE

One thing I did notice is that having the CLR server option enabled or disabled had no impact on deploying the function. This is handy if you don't want the CLR enabled all of the time: it can be enabled by the script that runs the drive_info() function and then disabled again immediately after.
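
As a sketch of that pattern (the collection statement itself is elided here), the script could look like this:

-- Enable the CLR just long enough to run the collection.
EXECUTE sp_configure 'clr enabled', 1
RECONFIGURE WITH OVERRIDE

-- ... call the drive_info() function here ...

-- Disable the CLR again when finished.
EXECUTE sp_configure 'clr enabled', 0
RECONFIGURE WITH OVERRIDE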

Moving on to the Visual Studio setup, make sure the connection in the CLR database project points to the DBA_Tools database. Set the Assembly name property on the Application tab of the Solution property page to something more appropriate.

Set the Assembly Name

The CLR function performed as advertised after setting everything up correctly. The function was built in both Visual Studio 2005 and 2008 and deployed to the corresponding versions of SQL Server. The setup worked on both SQL Server 2005 and 2008. Here is a sample output.

Results of the drive_info function

Using the DriveInfo table from the previous installments, the script to populate it would be:

INSERT INTO DriveInfo
SELECT letter   AS Drive
     , total_mb AS DriveSize
     , free_mb  AS FreeSpace 
  FROM drive_info()
 WHERE type = 'Fixed'

Although this solution works well enough, there is some risk with this approach. Based on how an Unsafe assembly is handled within the SQLOS, there is a risk of memory leaks when an unhandled exception occurs. If a solution can be created that uses alternative SQL Server features rather than an Unsafe CLR assembly, I would choose the alternative. Speaking of alternatives, that's where the next installments come in. They will cover using SSIS to get the drive info via the Script Component and WMI.