SSIS and Oracle

Working with Oracle and SSIS/DTS on a 64-bit Windows server has been a painful experience.

Here are the steps to connect to Oracle if you’re setting up a new SSIS package.

  1. Create a connection using the .net Providers\OracleClient Data Provider
  2. Use the DataReader source type.  Select the Oracle connection you created in step one.
  3. Use whatever destination you need to – it doesn’t matter at this point.  I typically use the OLE DB for SQL Server destinations.
  4. Schedule as a job with the Integration Services type step.  No 32-bit DTEXEC modification is needed, as was the case with DTS steps that hit Oracle.

You may hit some quirky errors with overflows or numeric data types.  It’s good to make sure you have the latest Oracle patchset applied before proceeding with workarounds.

Backups

I’ve always been a stickler for backups, even before I was a sysadmin.  I create music on my home computer, creating massive volumes of data just waiting to disappear from my life.

I have been using SuperDuper and Time Machine on OS X.  I don’t back up my Windows machine; since it is a media center, we mostly just stream stuff off the web.

When I leave town I throw the backup drives in an unlocked fireproof safe with a note begging thieves not to steal it.  This data is truly unique and irreplaceable.

But what if there was a fire when I didn’t have my drives in the safe?  I’d be very sad, that’s what.

There are a few online backup services which are starting to look very attractive.  Most offer 2GB for free.  I’ll be needing at least 50GB for audio mixes.  Actual song files are another story… maybe 300GB or so.  Then there are various video, picture and music files I’d hate to lose.  So it’s looking like I’ll be needing around 500GB to lock it all up.

Enter Mozy – $4.95 a month for unlimited storage space.  I assume this is for one computer.  What a deal.  That puts a year at $60.  40 years at $2400.

Sold.

Reporting Services and Sharepoint Integrated mode

Microsoft’s licensing has once again saved me some work.  To run Reporting Services in SharePoint integrated mode, you will need a SharePoint license on your reporting server.  A SOAP endpoint is enabled on the report server which lets SharePoint talk to the report server.  This seems like a strange departure from typical MSFT licensing.  Needless to say, we will not be buying another license.

Helping out the auditors

If you’d like to give an auditor their very own copy of your data on their machine and your DB is less than 4 GB, then read on.  As a responsible IT person, you won’t be putting sensitive information on this auditor machine, unless it is properly secured, electronically and physically, right?

1. Set up the auditor machine

Install SQL Express on the auditor machine. Enable remote connections via TCP/IP with the config tools. Grant permissions for your agent account on the source server to be an admin on the auditor’s machine. You’ll also need to grant access for the agent account to write to a backup directory on the auditor machine.
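The permission grants in step 1 can be sketched in T-SQL; this would run on the auditor’s SQL Express instance, and [Domain\SQLAgentAcct] is a placeholder for your agent’s actual service account:

-- Create a login for the source server's agent account and make it
-- sysadmin so it can run the restore. [Domain\SQLAgentAcct] is a placeholder.
CREATE LOGIN [Domain\SQLAgentAcct] FROM WINDOWS
EXEC sp_addsrvrolemember N'Domain\SQLAgentAcct', N'sysadmin'

The write access to the backup directory is an NTFS/share permission, so that part happens in Windows, not T-SQL.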

2. Create a script that will restore the database and do some other steps. This script is referenced from sqlcmd in step 3. My script does these things:

MASTER.dbo.usp_killusers 'MyDB' -- something similar to this script: http://www.sqlservercentral.com/articles/Administration/71/

RESTORE DATABASE MyDB FROM DISK = '\\workstation\backup\MyDB_Auditor.bak'
WITH REPLACE,
MOVE 'MyDBtrainingsys' TO 'c:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\Data\MyDB.mdf',
MOVE 'MyDBtrainingdata' TO 'c:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\Data\MyDB_1.ndf',
MOVE 'MyDBtraininglog' TO 'c:\Program Files\Microsoft SQL Server\MSSQL.1\MSSQL\Data\MyDB_2.ldf'

-- Drop our auditing DDL trigger from the database
IF EXISTS (SELECT * FROM sys.triggers WHERE name = N'DBA_Audit' AND parent_class = 0)
DROP TRIGGER [DBA_Audit] ON DATABASE

EXEC sp_changedbowner 'sa'

-- Add rights for whoever needs access
CREATE USER [Domain\AuditorLogin] FOR LOGIN [Domain\AuditorLogin]
EXEC sp_addrolemember N'db_datareader', N'Domain\AuditorLogin'
EXEC sp_addrolemember N'db_datawriter', N'Domain\AuditorLogin'
EXEC sp_addrolemember N'db_ddladmin', N'Domain\AuditorLogin'
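If you don’t have a kill-users proc handy on the Express instance, a sketch of an alternative that accomplishes the same thing with plain ALTER DATABASE (run it just before the restore):

-- Boot connected users by forcing single-user mode,
-- rolling back any open transactions immediately
ALTER DATABASE MyDB SET SINGLE_USER WITH ROLLBACK IMMEDIATE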

3. Create the job with these steps on your source server.

(T-SQL type) – Create backup on workstation
BACKUP DATABASE MyDB TO DISK = '\\workstation\Backup\MyDB_Auditor.bak'

(OS Command) – Restore
sqlcmd -i \\Share\auditorRestore\MyDBAuditorRestore.sql -Sworkstation\sqlexpress -o \\share\sf.log

(OS Command) – Cleanup
del \\workstation\backup\MyDB_Auditor.bak

Partied with Microsoft today

I attended the Microsoft technology roadshow today. After being paged at 4:30 a.m. over some I/O nonsense related to statistics updates, which I stayed up until 6:30 trying to prevent from happening again, I made my way to downtown Phoenix.

Arrived just in time to see the new nastiness that is Windows 7/Server 2008 R2 licensing.

Sharepoint 2010

Looks nice. Ajaxified. Works in Firefox, via a Silverlight plugin I would assume. Although they also claim it works on smartphones, so maybe it is just Ajax. SharePoint = pricey.

BI/Insight

This is the real magic. Excel is nothing to laugh at. Excel 2007 has some nice features when it comes to connecting to alternate data sources such as Analysis Services. I have some plans to work with this tech in the future. 2010 takes it further and adds a dash of manageability while actually making things look good. When I say good, I mean Apple good… almost. They even have little carousels for browsing reports.

They have improved manageability of user-created content by leaps and bounds. Users can publish Excel docs. Other users can create reports based on those docs. If usage of the base Excel doc starts to accelerate, you will know it and can act accordingly by taking over management of the ‘application’. That’s what I’m talking about, Napoleon.

A hotshot consultant showed us how he did a break out/data driven report in reporting services. I was hoping for something a little more impressive from this ‘prodigy’, as his boss referred to him.

I see ‘self-service’ BI as a bit of a pipe dream. MSFT is putting the framework in place so that if your users manage to figure out this mess of tech, you’ll at least know what is bombing on your servers.

Also, Gemini has been officially dubbed PowerPivot. Reminds me a bit of Cognos Power Play.

Seeing all the connections for all SSIS packages on your server

So you have a mess of packages on your server and you’d like to know what they connect to without going into each package? Yeah, me too.

Create a C# console application with Visual Studio (Express will probably work).

Paste this code in:

using System;
using System.Collections.Generic;
using System.Text;
using Microsoft.SqlServer.Dts.Runtime;
using System.Text.RegularExpressions;
using System.IO;

namespace SSISPackageInspector
{


public static class CsvWriter
{
    private static StreamWriter _writer;

    // Open the output file once, up front; fall back to a default path
    // if no \o argument was supplied.
    public static void Open(string path)
    {
        _writer = new StreamWriter(path ?? @"c:\temp\ssis_list.csv");
    }

    public static StreamWriter Writer
    {
        get { return _writer; }
    }

    public static void Close()
    {
        if (_writer != null) _writer.Close();
    }
}

class Program
{
    static void Main(string[] args)
    {
        string outfile = null;
        string folder = null;
        string server = null;

        if (args.Length == 0)
        {
            Console.WriteLine(@"Usage: \sServerName \fSsisFolder \oOutputFile");
            Console.Read();
            return;
        }

        foreach (string s in args)
        {
            if (s.Length < 2)
                continue;

            if (s.Substring(0, 2) == @"\o")
                outfile = s.Substring(2);

            if (s.Substring(0, 2) == @"\f")
                folder = s.Substring(2);

            if (s.Substring(0, 2) == @"\s")
                server = s.Substring(2);
        }

        CsvWriter.Open(outfile);

        // Look up all package names for a server and list their connections
        PrintDirectoryContents(folder, server);

        // Flush and close the csv so the output actually hits disk
        CsvWriter.Close();
        Console.Read();
    }


    static void PrintDirectoryContents(string path, string server)
    {
        Application ssisApplication = new Application();
        PackageInfos sqlPackages = ssisApplication.GetDtsServerPackageInfos(path, server);

        if (sqlPackages.Count > 0)
        {
            Console.WriteLine("Packages stored in " + path);
            foreach (PackageInfo sqlPackage in sqlPackages)
            {
                Console.WriteLine(sqlPackage.Name);
                Console.WriteLine(" " + sqlPackage.Flags);

                if (sqlPackage.Flags.ToString() == "Package")
                {
                    GetConnections(@"\" + path, sqlPackage.Name, server);
                }

                // Recurse into subfolders
                if (sqlPackage.Flags.ToString() == "Folder")
                {
                    PrintDirectoryContents(path + @"\" + sqlPackage.Name, server);
                }
            }
        }
    }

    static void GetConnections(string path, string pkgname, string server)
    {
        // Load the package from the SSIS server and walk its connection managers.
        Application mApplication = new Application();
        Package pkg = mApplication.LoadFromDtsServer(path + @"\" + pkgname, server, null);
        Connections conns = pkg.Connections;

        ConnectionEnumerator myEnumerator = conns.GetEnumerator();
        Console.WriteLine("------" + pkg.Name + "----------");

        while (myEnumerator.MoveNext() && (myEnumerator.Current != null))
        {
            string myLine = server + "," + path + "," + pkg.Name + "," + myEnumerator.Current.Name;

            // Pull name=value pairs out of the connection string
            Regex r = new Regex("([^=;]*)=([^;]*)");
            MatchCollection mc = r.Matches(myEnumerator.Current.ConnectionString);

            Console.WriteLine("Conn Name:" + myEnumerator.Current.Name);

            string s = myEnumerator.Current.ConnectionString;
            // File connections are just a UNC path rather than name=value pairs
            if (s.Length >= 2 && s.Substring(0, 2) == @"\\")
            {
                Console.WriteLine(" " + s);
                myLine += "," + s;
            }

            foreach (Match m in mc)
            {
                if (m.Groups[1].ToString() == "Data Source")
                {
                    Console.WriteLine(" Server:" + m.Groups[2]);
                    myLine += "," + m.Groups[2];
                }

                if (m.Groups[1].ToString() == "Initial Catalog")
                {
                    Console.WriteLine(" Database:" + m.Groups[2]);
                    myLine += "," + m.Groups[2];
                }
            }

            CsvWriter.Writer.WriteLine(myLine);
        }

        Console.WriteLine("------End Package-------");
    }
}

}

Go to project > properties > debug > start options

Enter these parameters

\sMyServer \fmsdb\mydirectory \oc:\temp\ssis_list.csv

The first is the SSIS server you’d like to connect to.

The second is the SSIS directory to process – you can just use msdb and do the whole shebang.

The final is your output file. Duh.

You can now press play in VS and see the console window spin by. Look for your file in the directory you specified. You can do all kinds of spiffy stuff to packages using .net.

More information on running and managing packages programmatically is available from the good folks at MSFT.
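As a quick sanity check against the console output, you can also query the package store tables in msdb directly; this is a sketch assuming SQL 2005 and the default msdb package store:

-- List stored packages and their folders straight out of msdb (SQL 2005)
SELECT f.foldername, p.name
FROM msdb.dbo.sysdtspackages90 p
JOIN msdb.dbo.sysdtspackagefolders90 f ON p.folderid = f.folderid

It won’t show you connections like the console app does, but it is a fast way to confirm you walked every folder.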

DTUtil

Constantly using ‘Save a Copy As…’ in BIDS to save an SSIS package to the server annoys me to no end. Does MSFT actually use this crap?

DTUTIL.exe is a tool that comes in your 90\DTS\binn folder. Check this out:

C:\Program Files\Microsoft SQL Server\90\DTS\Binn>

dtutil /En SQL;subfolder\packagename;5 /Fi "C:\Documents and Settings\me\My Documents\Visual Studio 2005\Projects\ProjectName\blah\PackageName.dtsx" /DestServer devserver

note: For the first option (/En), the subfolder does not need to include the MSDB folder which you will see when browsing SSIS.

Paste that into your cmd window and you’ll be saving with two keystrokes. Your options may change, but with the help of the online docs, you’ll be happier.

If dtutil isn’t recognized you aren’t in the proper directory. You can also add the binn directory to your PATH variable. Find the PATH variable and add “;C:\Program Files\Microsoft SQL Server\90\DTS\Binn” to the end of the string. Note the semicolon at the start.

Rebuilding a Maintenance Plan

Today I needed to rebuild a maintenance plan with specially selected databases. Anyone who has done it knows that the interface for selecting databases falls short in one key area: the window cannot be resized and shows only about 7 databases. My solution was to store the selected dbs for the plan in a temp table and then update the new plan with that info. Even the green DBA knows a bit about the system tables, and isn’t afraid to play with them. That said, it’s a good idea to take a backup of the msdb database before doing anything to it.

SQL Server 2000 stores info related to DTS and maintenance plans in msdb. The two tables I’m concerned with are sysdbmaintplans and sysdbmaintplan_databases. There is an entry in the former for each maintenance plan, and an entry in the latter for each database used by the plan. They are associated by a guid. Once you know this, the rest is pretty easy. Definitely better than checking 80 boxes for me.
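To find a plan’s guid without hunting through the raw table, a quick lookup by name helps (the plan name here is made up):

-- Find a maintenance plan's guid by name (SQL 2000, msdb)
SELECT plan_id, plan_name
FROM msdb.dbo.sysdbmaintplans
WHERE plan_name = 'My Old Plan'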

-- Insert database names and the plan id from the plan we need to rebuild into a temp table
SELECT my_db_name, plan_id
INTO user_myDBsforOtherPlan
FROM
(SELECT sysdbmaintplan_databases.database_name AS my_db_name, sysdbmaintplan_databases.plan_id AS plan_id
FROM sysdbmaintplans
INNER JOIN sysdbmaintplan_databases ON sysdbmaintplans.plan_id = sysdbmaintplan_databases.plan_id
WHERE sysdbmaintplans.plan_id = '{699A6A76-2BC3-4EA6-BAF1-0E268F82B095}') myDBsforOther

-- Go create your new maintenance plan and select just one database.
-- Then find the new plan's guid and substitute it in the queries below.
SELECT * FROM dbo.sysdbmaintplan_databases

-- Update our temp table with the new plan's id, so the rows match.
-- This could just go in the final insert as a constant, I suppose.
UPDATE user_myDBsforOtherPlan
SET plan_id = '{B9051EA2-F303-4E4F-BB17-FBF26E9B0C48}'

-- Insert the rows into sysdbmaintplan_databases, except the one we already
-- added to the newly created plan
INSERT INTO dbo.sysdbmaintplan_databases (plan_id, database_name)
SELECT my.plan_id, my.my_db_name
FROM dbo.user_myDBsforOtherPlan AS my, dbo.sysdbmaintplan_databases notmine
WHERE my.plan_id = notmine.plan_id
AND my.my_db_name <> notmine.database_name

Copying a database from production to development – a refresh

Today’s challenge was to move a production database down to development nightly. It needed to happen after 4pm each weekday. Easy, right?

This was all done in SQL 2005, but the same ideas could be applied to 2000.

My plan was to create a backup at 4pm, copy it to the destination server and then do a restore.

1. Create the backup.

Normally SQL Server will try to append the date and time to the end of the name. I needed the same name every time, so I made that happen. Saved to a local drive with the name backup_export.bak.
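The fixed-name backup step can be sketched like this; the database name and path are examples matching the .bat file below, and WITH INIT is what keeps the file name constant by overwriting the previous backup:

-- Overwrite the same file every run instead of appending date/time
BACKUP DATABASE Inventory
TO DISK = 't:\backup\inventory\inventory_export.bak'
WITH INIT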

2. Copy the backup.

This was a foray into unexplored territory for me. A batch file seemed like a good solution, since SQL Server only seems to be able to back up to local drives (am I right about this?). I found the xcopy command and created a .bat file which looked like this:

t:
cd backup\inventory
xcopy inventory_export.bak \\myserver\e$\BACKUPs\Inventory /Y

First I change to the t: drive, where my backup is, and then move to the inventory directory. The xcopy command is next, which says “xcopy myfile tohere”. /Y means don’t confirm overwriting, which is what I want. Being inexperienced, I tried a few ridiculous things before using the UNC path to get straight to my server.

I scheduled this .bat file to run with Windows Scheduled Tasks shortly after my database backup is created. The backup was small and only took 30 seconds to create.
3. Set up the restore

I created a maintenance plan with a restore task on the development server to restore the file. This is the trickiest part. We go in and modify the job that the maintenance plan creates, adding steps before and after for prep and cleanup.

First we disconnect any connected users; otherwise the restore will fail.

ALTER DATABASE dbName SET SINGLE_USER WITH ROLLBACK IMMEDIATE

Then we restore the database.

Next we need to fix the SQL logins, which get messed up by copying a database from one server to another:
DECLARE @SQLString VARCHAR(7500)
SELECT @SQLString = ''
SELECT @SQLString = @SQLString
 + 'EXEC sp_Change_Users_Login ''UPDATE_ONE'','
 + QUOTENAME([Name],'''') + ',' + QUOTENAME([Name],'''') + '; '
FROM SysUsers
WHERE IsSQLUser = 1
 AND (SId IS NOT NULL AND SId <> 0x0)
 AND SUSER_SNAME(SId) IS NULL
ORDER BY [Name]

PRINT @SQLString
EXEC (@SQLString)
Next you need to fix any Windows users by granting them access. I didn’t have this problem, but you just need to do a grant for each Windows user in the db.

Here is some info on doing that in 2005 SP2:

ALTER USER – addition of WITH LOGIN clause

This new syntax for ALTER USER allows remapping a user to another login by changing the user’s SID value to match the login’s SID. This can be used to repair orphaned users. It works for both Windows and SQL Server logins, unlike sp_change_users_login, which worked only for SQL Server logins. This should become the preferred command for fixing orphaned users. If the user is a Windows user and has a Windows user name (domain\user), then the user will be automatically renamed to the login name as part of the remapping operation. (from Laurentiu)

As Laurentiu says, beware using sp_change_users_login ‘report’ to find orphaned users; it only returns SQL users, not Windows users.
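For reference, a minimal sketch of that 2005 SP2 syntax; [Domain\SomeUser] is a placeholder, substitute your actual user and login names:

-- Remap an orphaned database user to its matching login (SQL 2005 SP2+).
-- Works for both Windows and SQL logins, unlike sp_change_users_login.
ALTER USER [Domain\SomeUser] WITH LOGIN = [Domain\SomeUser]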