Restore Database From Latest Backup

Here is a bit of code, adapted from Nigel Rivett’s original. You can use it to copy one database over another or to restore the most recent backup of a database over itself.

--BEGIN TRAN

IF EXISTS ( SELECT *
FROM dbo.sysobjects
WHERE id = OBJECT_ID(N'[dbo].[s_RestoreLatestBackup]')
AND OBJECTPROPERTY(id, N'IsProcedure') = 1 )
DROP PROCEDURE [dbo].[s_RestoreLatestBackup]
GO

CREATE PROC s_RestoreLatestBackup
@dbname VARCHAR(128) ,
@destdbname VARCHAR(128) ,
@localBackupPath VARCHAR(200) ,
@localDBPath VARCHAR(200) ,
@localLogPath VARCHAR(200) ,
@recipients VARCHAR(128),
@filenamesuffix VARCHAR(50)
AS /*
exec s_RestoreLatestBackup
@dbname = 'hrmprot' ,
@destdbname = 'HRMPROT2',
@localBackupPath = 'K:\Backups\' ,
@localDBPath = 'I:\Data\' ,
@localLogPath = 'J:\Logs\' ,
@recipients = 'myemailaddress',
@filenamesuffix = '_2'
*/

DECLARE @cmd VARCHAR(2000) ,
@filename VARCHAR(128) ,
@s VARCHAR(128) ,
@i INT ,
@d DATETIME ,
@sql NVARCHAR(2000) ,
@StartDate DATETIME

SELECT @StartDate = GETDATE()

-- drop database
IF EXISTS ( SELECT *
FROM master..sysdatabases
WHERE name = @destdbname )
BEGIN
SELECT @cmd = 'drop database ' + @destdbname
EXEC (@cmd)
END

-- get latest backup filename
SELECT @cmd = 'dir /B ' + @localBackupPath + @dbname + '\*.*'
PRINT @CMD
CREATE TABLE #a ( s VARCHAR(2000) )
INSERT #a
EXEC master..xp_cmdshell @cmd

DELETE #a
WHERE s IS NULL
OR s NOT LIKE '%_backup_%'

SELECT *
FROM #a

SELECT @filename = MAX(s)
FROM #a

-- Get files in backup
SELECT @cmd = 'restore filelistonly from disk = ''' + @localBackupPath
+ @dbname + '\' + @filename + ''''
PRINT @cmd
CREATE TABLE #files
(
[LogicalName] NVARCHAR(128) ,
[PhysicalName] NVARCHAR(260) ,
[Type] CHAR(1) ,
[FileGroupName] NVARCHAR(128) ,
[Size] NUMERIC(20,0) ,
[MaxSize] numeric(20,0) ,
[FileId] BIGINT ,
[CreateLSN] NUMERIC(25, 0) ,
[DropLSN] NUMERIC(25, 0) ,
[UniqueId] UNIQUEIDENTIFIER ,
[ReadOnlyLSN] NUMERIC(25, 0) ,
[ReadWriteLSN] NUMERIC(25, 0) ,
[BackupSizeInBytes] BIGINT ,
[SourceBlockSize] INT ,
[FileGroupId] INT ,
[LogGroupGUID] UNIQUEIDENTIFIER ,
[DifferentialBaseLSN] NUMERIC(25, 0) ,
[DifferentialBaseGUID] UNIQUEIDENTIFIER ,
[IsReadOnly] BIT ,
[IsPresent] BIT ,
[TDEThumbprint] VARBINARY(32)
)
INSERT #files
EXEC ( @cmd
)

SELECT *
FROM #files
-- build the restore command
SELECT @cmd = NULL ,
@s = ''

WHILE @s < ( SELECT MAX(LogicalName)
FROM #files
)
BEGIN
SELECT @s = MIN(LogicalName)
FROM #files
WHERE LogicalName > @s


SELECT @cmd = COALESCE(@cmd + ', move ', '') + '''' + @s
+ ''' to '''
+ CASE WHEN #files.Type = 'D' THEN @localDBPath
ELSE @localLogPath
END
+ CASE WHEN #files.Type = 'D' THEN LogicalName + @filenamesuffix + '.mdf'
ELSE LogicalName + @filenamesuffix + '.ldf'
END
+ ''''
FROM #files
WHERE LogicalName = @s
END
SELECT @cmd = 'restore database ' + @destdbname + ' from disk = '''
+ @localBackupPath + @dbname + '\' + @filename + ''' with move '
+ @cmd -- + ', standby = ''' + @localBackupPath + 'standby.fil'''

PRINT @cmd

-- restore the database
EXEC (@cmd)

drop table #files
drop table #a

-- Check that the restore was successful
select @sql = 'select @i = count(*) from [' + @destdbname + ']..sysobjects select @d = crdate from master..sysdatabases where name = ''' + @destdbname + ''''
exec sp_executesql @sql, N'@i int out, @d datetime out', @i out, @d out

/*
if @d > @StartDate and @i > 20
begin
if @recipients is not null
begin
select @cmd = 'restore ' + @filename + ' completed successfully - started ' + convert(varchar(8), @StartDate, 112) + ' ' + convert(varchar(8), @StartDate, 108)
exec master..xp_sendmail @subject = @cmd, @recipients = @recipients, @message = @@servername
end
end
else
begin
select @cmd = '********************* restore ' + @filename + ' failed - started ' + convert(varchar(8), @StartDate, 112) + ' ' + convert(varchar(8), @StartDate, 108)
if @recipients is not null
begin
exec master..xp_sendmail @subject = @cmd, @recipients = @recipients, @message = @@servername
end
raiserror(@cmd, 16, -1)
end
*/
go

--COMMIT TRAN
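The WHILE loop in the proc above walks the file list without a cursor by repeatedly selecting the smallest LogicalName greater than the last one processed. Here's that pattern sketched in Python (the function name and sample data are mine, for illustration only):

```python
# Cursorless iteration pattern: repeatedly take the smallest key greater
# than the last one seen, until none remain. Mirrors the T-SQL
# "SELECT @s = MIN(LogicalName) ... WHERE LogicalName > @s" loop.

def iterate_min_greater_than(names):
    seen = ""
    visited = []
    while any(n > seen for n in names):
        # smallest name we haven't processed yet
        seen = min(n for n in names if n > seen)
        visited.append(seen)
    return visited

print(iterate_min_greater_than(["data2", "log", "data1"]))  # ['data1', 'data2', 'log']
```

Each pass visits every row once in sorted order, which is exactly what the proc needs to build one MOVE clause per file.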

Creating Restore Statements With PowerShell

WARNING: Use caution in running scripts you find on the internet. You are responsible for the well-being of your systems!

We’re sitting around waiting for the SAN monkeys to fling some crap our way – not the SAN guys, the SAN monkeys, kinda like gremlins, but not as bad. Our primary cluster is down, which runs 100+ databases. Maybe the disks will come up – or maybe they won’t.

Assessing the situation, I found that msdb was not in full recovery mode. People, please don’t run around switching databases to simple recovery to ‘save space’ without thinking the ramifications through. In this case we lose all backup history, so we now need to figure out which backup file is the most recent, then which log files come after that .bak file – then we slam them together into one script, leaving the NORECOVERY clause off the last log restore statement. If you think I’m doing this by hand, you’re crazy.

We make an assumption here. Each file holds only one backup set. This is how we do our backup files, so I’m OK with this assumption.


$DatabaseBackupRoot = "\\MYSERVA\e$\Backups\"
$fix = ""

$DatabaseBackupDirs = gci $DatabaseBackupRoot

foreach ($DatabaseBackupDir in $DatabaseBackupDirs)
{
$DB = $DatabaseBackupDir.Name

$BackupRestore = ""
$LogRestore = ""
# Parameters

$Path = "$DatabaseBackupRoot$DB";
$FullPattern = "*.bak";
$LogPattern = "*.trn";

# Enumerate the last full backup
$FullBackup = Get-ChildItem $Path -Filter ($DB + $FullPattern) `
| sort Name `
| Select-Object -Last 1;

# Enumerate Log Backups
$LogBackups = Get-ChildItem $Path -Filter ($DB + $LogPattern) `
| Where-Object { $_.LastWriteTime -gt $Fullbackup.LastWriteTime} `
| sort Name;

$LogBackupsCount = $LogBackups.Count

$BackupFileName = $FullBackup.Name
# if the backupfile is null, skip this
if ($BackupFileName)
{

if ($LogBackupsCount -gt 0)
{
$BackupRestore = "RESTORE DATABASE [$db] FROM DISK = N'$DatabaseBackupRoot$DB\$BackupFileName' WITH FILE = 1, NORECOVERY, NOUNLOAD, STATS = 10
GO
"
}
else
{
$BackupRestore = "RESTORE DATABASE [$db] FROM DISK = N'$DatabaseBackupRoot$DB\$BackupFileName' WITH FILE = 1, NOUNLOAD, STATS = 10
GO
"
}

}
else{
# Empty directories end up here; write out the database name so we can see what was skipped.
Write-Output $DB
}

$counter = 0
$len = $LogBackups.Count

foreach ($LogBackup in $LogBackups)
{
$counter++

$LogBackupFile = $LogBackup.Name;
#Test for no log/empty directory
if($LogBackupFile)
{
if($counter -lt $len)
{
$LogRestore = $LogRestore + "
RESTORE LOG [$db] FROM DISK = N'$DatabaseBackupRoot$DB\$LogBackupFile' WITH FILE = 1, NORECOVERY, NOUNLOAD, STATS = 10
GO
"
}
else
{
$LogRestore = $LogRestore + "
RESTORE LOG [$db] FROM DISK = N'$DatabaseBackupRoot$DB\$LogBackupFile' WITH FILE = 1, NOUNLOAD, STATS = 10
GO
"

}
}

}

$RestoreOneDatabase = $BackupRestore + $LogRestore

$RestoreAllDatabases = $RestoreAllDatabases + $RestoreOneDatabase

}

$RestoreAllDatabases | out-file "c:\temp\restore-MYSERVA.sql"

So now we’ve got a SQL script with all these restores compiled together, separated by GO statements. Pretty neat. I’d probably run small blocks of it and fix any issues the generated script might have.
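If you do want to run it in blocks, splitting the generated script on its GO separators is easy to automate. A sketch in Python (the sample SQL is illustrative, not from the script above):

```python
# Split a generated restore script into batches on lines that are just "GO",
# so blocks can be inspected or run one at a time.

def split_batches(script_text):
    batches, current = [], []
    for line in script_text.splitlines():
        if line.strip().upper() == "GO":
            if current:
                batches.append("\n".join(current).strip())
            current = []
        else:
            current.append(line)
    # trailing batch with no final GO
    if any(l.strip() for l in current):
        batches.append("\n".join(current).strip())
    return batches

sql = "RESTORE DATABASE [a] FROM DISK = N'a.bak'\nGO\nRESTORE LOG [a] FROM DISK = N'a.trn'\nGO\n"
print(len(split_batches(sql)))  # 2
```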

Monitoring SSAS for process failures

When a cube fails to process, you don’t want to hear about it from your customer. As the last step in our data warehouse load, I was processing a cube using the script generated from the SSMS dialog. That worked fine most of the time, but sometimes we’d have a failure in development after changes had been made during the day.

Here is what worked. A note on this – don’t use $( anywhere in your PowerShell code or the SQL Agent job step will parse it as a token and tell you to escape it. My first thought was to send an email to the DBAs when the process failed – or rather, hadn’t happened within the last 12 hours – but I opted for a throw instead. Throw raises a nice error in the job and brings it to the attention of our central monitoring system.


## Add the AMO namespace
$loadInfo = [Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices")

## Connect and get the edition of the local server
$connection = "SERVER01"
$server = New-Object Microsoft.AnalysisServices.Server
$server.connect($connection)

$databases = $server.get_Databases()
$WF_DB = $databases.GetByName("SSASDB1")

$d = Get-Date # two steps, because a SQL job step doesn't like the dollar-sign (Get-Date).AddHours syntax
$date = $d.AddHours(-12)

if ( $WF_DB.LastProcessed -lt $date)
{# Cube not updated since our threshold date

$dateprocessed = $WF_DB.LastProcessed
$lastupdate = $WF_DB.LastUpdate
$parent = $WF_DB.Parent

$message = "SSASDB1 on $parent has not processed since $date. The last process was on $dateprocessed. Manually reprocess the cube to see the error message, or view the previous job step for additional information. Thanks!"
throw $message
}
else
{
echo "Cube updated after $date"

}

SQL PowerShell

I’m getting a little tired of always clicking around in SSMS. Here goes a shot at PowerShell. I’ve done a few tutorials on PowerShell, but I don’t use it enough to remember much. The only thing I do remember is Get-ChildItem and its alias GCI. That’s a start. Follow along with me.

User asks if a database was backed up last night. Of course it was, but let’s just make sure.

Right-click on the Databases node of the server I’m interested in within SSMS and choose ‘Start PowerShell’.

I end up at the databases path and try to copy the text out of the window with a mouse swipe to paste it in here. No dice, so I pop into properties and enable quick edit mode – I prefer it on in the CMD window. A highlight and a right-click will copy text out of the window.

PS SQLSERVER:\SQL\SERVER1\DEV08\Databases>

You can see the Server, instance name and then the databases node.

I enter my one command: GCI

A long list of databases fills the screen, but not the backup information I’m looking for. I know there is a PowerShell cmdlet (a ‘command-let’, a PowerShell command object) which will tell me more about the children of the Databases node – the databases themselves. Fortunately, everyone on the web likes to talk about it. It’s the Get-Member cmdlet, which will tell you the methods and properties of whatever object you feed, or ‘pipe’, it. Let’s pipe those database objects to Get-Member’s alias.

GCI | GM

Whoa. That’s a lot of output. I could use a little help sorting through it. Maybe I can pipe that list to another cmdlet and search for a string. There is probably a cmdlet called Find.

GCI | GM | FIND 'backup'
FIND: Parameter format not correct

So how do I know what a cmdlet is expecting?

find /?

That gives me some help info, but changing the single quotes to double quotes doesn’t seem to do the trick.

Get-Help find

There’s a listing of somewhat unrelated commands. What did I just do there? What happens if I don’t give it a parameter?

Get-Help

TOPIC
Get-Help

SHORT DESCRIPTION
Displays help about Windows PowerShell cmdlets and concepts.

LONG DESCRIPTION

SYNTAX
get-help {<CmdletName> | <TopicName>}
help {<CmdletName> | <TopicName>}
<CmdletName> -?

I expected to see a help page a bit more like this when I typed Get-Help find, but instead I saw a listing – Find was being interpreted as a TopicName. So let’s look at that listing again – maybe there’s a more appropriate cmdlet I can use to find my search string in the output of Get-Member. Select-String stands out to me – “Finds text in strings and files.”

GCI | GM | Select-String "Backup"

System.Void DropBackupHistory()
System.Data.DataTable EnumBackupSetFiles(int backupSetID), System.Data.DataTable EnumBackupSetFiles()
System.Data.DataTable EnumBackupSets()
System.Boolean IsDbBackupOperator {get;}
System.DateTime LastBackupDate {get;}
System.DateTime LastDifferentialBackupDate {get;}
System.DateTime LastLogBackupDate {get;}

There it is – LastBackupDate – and even more.

So I’ll accomplish my task at hand.

GCI | SELECT NAME, LASTBACKUPDATE, LASTLOGBACKUPDATE

But I’m only interested in a couple databases which start with “Risk”.

GCI | where { $_.name -like "Risk*" } | SELECT NAME, LASTBACKUPDATE, LASTLOGBACKUPDATE

The $_ represents the object being sent from the previous cmdlet, so you can filter on any property you found in the Get-Member listing.

Mission accomplished! That gives me what I need, and I can feel smug about not clicking around in SSMS. True, it took me five minutes to learn – longer than just clicking on the two databases that were returned – but learning this will let me complete other, more complex tasks.

Oh yeah, one other thing. Out-Host -Paging will let you page through the sometimes lengthy manual pages. OH -p also works.

See all roles on your server

This cursor inserts all of the roles and members from each database into your admin database.


USE [SQL_DBA_Sandbox]
GO

IF NOT EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[dbo].[Roles]') AND type in (N'U'))
BEGIN
CREATE TABLE [dbo].[Roles](
[DBName] [varchar](max) NULL,
[Role] [varchar](max) NULL,
[Member] [varchar](max) NULL
) ON [PRIMARY]
END
ELSE
DELETE FROM SQL_DBA_Sandbox.dbo.Roles;
GO

DECLARE @DB_NAME sysname;
SET @DB_NAME = NULL;

DECLARE @SQL VARCHAR(2000);

DECLARE Cursor_DB CURSOR FAST_FORWARD READ_ONLY FOR

SELECT name FROM sys.databases
WHERE database_id > 4

OPEN Cursor_DB

FETCH NEXT FROM Cursor_DB INTO @DB_NAME
WHILE @@FETCH_STATUS = 0
BEGIN

SET @SQL =
'INSERT INTO SQL_DBA_Sandbox.dbo.Roles
( DBNAME, ROLE, MEMBER )
SELECT ''' + @DB_NAME + ''',
r.name AS ''role'',
mem.name AS ''member''
FROM [' + @DB_Name + '].sys.database_role_members rm
INNER JOIN [' + @DB_Name + '].sys.database_principals r ON r.principal_id = rm.role_principal_id
INNER JOIN [' + @DB_Name + '].sys.database_principals mem ON mem.principal_id = rm.member_principal_id';
PRINT @SQL;

EXEC (@SQL);

FETCH NEXT FROM Cursor_DB INTO @DB_NAME

END

CLOSE Cursor_DB
DEALLOCATE Cursor_DB

SELECT DISTINCT DBName,[Role] FROM SQL_DBA_Sandbox.dbo.Roles
WHERE Role = 'analyst'

SELECT * FROM SQL_DBA_Sandbox.dbo.roles
WHERE Member = 'my_user'

See all the schemas on your server

Oh no! A useful cursor?


DECLARE @DB_NAME sysname;
SET @DB_NAME = NULL;

CREATE TABLE #RESULTS
(TABLE_CATALOG sysname,
TABLE_SCHEMA sysname)

DECLARE @SQL VARCHAR(2000);

DECLARE Cursor_DB CURSOR FAST_FORWARD READ_ONLY FOR

SELECT name FROM sys.databases
WHERE database_id > 4

OPEN Cursor_DB

FETCH NEXT FROM Cursor_DB INTO @DB_NAME
WHILE @@FETCH_STATUS = 0
BEGIN

SET @SQL = 'USE ['+ @DB_NAME + '];
INSERT INTO #RESULTS
( TABLE_CATALOG, TABLE_SCHEMA )
SELECT DISTINCT TABLE_CATALOG,TABLE_SCHEMA FROM INFORMATION_SCHEMA.TABLES
UNION ALL
SELECT DISTINCT TABLE_CATALOG,TABLE_SCHEMA FROM INFORMATION_SCHEMA.VIEWS
UNION ALL
SELECT DISTINCT ROUTINE_CATALOG,ROUTINE_SCHEMA FROM INFORMATION_SCHEMA.ROUTINES;'

PRINT @SQL;

EXEC (@SQL);

FETCH NEXT FROM Cursor_DB INTO @DB_NAME

END

CLOSE Cursor_DB
DEALLOCATE Cursor_DB

SELECT DISTINCT TABLE_CATALOG, TABLE_SCHEMA FROM #RESULTS
WHERE TABLE_SCHEMA <> 'dbo'

DROP TABLE #RESULTS

Backups by Brent

I was happy to see that we do quite a few of the things on Brent Ozar’s Backup best practices page.

http://www.brentozar.com/archive/2009/03/dev-test-and-production-sql-server-environments/

I’ve one-upped Brent on one point: we have an SSIS package which restores databases from server to server using our DBA_REP databases, which I borrowed from Rodney Landrum and modified for our environment.

One of Brent’s ideas I appreciate is his approach to keeping management apprised on restore times:

Keep management informed on restore time estimates.

Every quarter, I look at the backup sizes & speeds, and estimate how long it will take to restore a server from tape. I turn that into a written report to management, and for each server, I give cost estimates to decrease that outage window. Sometimes the answer is third-party backup compression software, sometimes it’s more CPU power for the server, sometimes it’s adding memory, etc. The primary goal is to give them enough information to make a good business decision about whether it’s worth the money. The secondary goal is to CYA: when a server is down, and a restore is taking longer than the business owners would like, then it’s a great time to pull out last quarter’s report and say, “See, I told you this would take an hour to restore, and you said it wasn’t worth the $5,000 to shrink down. Should we reconsider that decision now?”

I think we’ll be using this approach on a new project.
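The estimate behind that quarterly report is simple arithmetic – backup size divided by observed restore throughput. A quick sketch, where the numbers and function name are mine, not Brent's:

```python
# Rough restore-time estimate: how long a backup of a given size takes to
# restore at an observed throughput. Illustrative numbers only.

def restore_hours(backup_size_gb, restore_mb_per_sec):
    """Estimated wall-clock restore time in hours."""
    seconds = (backup_size_gb * 1024) / restore_mb_per_sec
    return seconds / 3600

# e.g. a 500 GB backup restoring at 80 MB/s:
print(round(restore_hours(500, 80), 1))  # 1.8
```

Plug in last quarter's measured speeds and you have the outage-window number to put in front of management.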

Enabling Network Trace in Reporting Services


I’m trying to set up Reporting Services 2005 to use SSL, with a 2008 instance on the same machine also using SSL. I’m trying the simple method of changing the SSL port of the 2005 site. The website’s SSL certificate seems valid when I go to the /Reports directory, but a call is going bad somewhere inside.

The underlying connection was closed: Could not establish trust relationship for the SSL/TLS secure channel.

I was directed to activate network tracing for the .net application as outlined here:

 http://msdn.microsoft.com/en-us/library/a6sbz1dx.aspx

After figuring out which version of .NET the application uses (it can be found in the IIS settings for the site), activating the trace was as simple as adding the proper XML to the web.config file for the SSRS web application.

I also added configuration to direct the trace output – the standard system.diagnostics block from the MSDN article above, pointed at network.log, looks roughly like this:

<system.diagnostics>
  <trace autoflush="true" />
  <sources>
    <source name="System.Net" tracemode="includehex" maxdatasize="1024">
      <listeners>
        <add name="System.Net" />
      </listeners>
    </source>
  </sources>
  <sharedListeners>
    <add name="System.Net" type="System.Diagnostics.TextWriterTraceListener" initializeData="network.log" />
  </sharedListeners>
  <switches>
    <add name="System.Net" value="Verbose" />
  </switches>
</system.diagnostics>
Now I’m seeing the network.log file in the ReportManager directory. I also see some calls to the web service which don’t include the 444 port number that would be necessary to make the SSL connection.

Upgrading SQL Reporting 2005

This weekend I’ll be upgrading a copy of our 2005 instance to 2008. Upgrading the copy will let us fall back to the old version if needed. (Update: and yes, we had to – see notes below.)

I opted to create a new instance on another server – apart from our clustered SQL engine. Running IIS on a SQL server is not recommended and you can run into some memory issues with IIS and SQL 2005 reports. It’s also a security issue, serving pages from your database server.

My plan was to install some 2008 components on the server – analysis, database, integration. Then I installed 2005 reporting and created a copy of the ReportServer database on this report server.

When it came time to upgrade the 2005 instance, the upgrade would not proceed. There were no components detected by the wizard to upgrade. So I uninstalled the 2008 database, moved the database to our new 2008 DB cluster and the upgrade worked. Then I cleared everything off and reinstalled 2005 with all the services. I patched it up and once more, restored the ReportServer (and of course ReportServerTempDB) to my 2008 cluster.

A quick note – if you’re planning on hosting more than one SSRS database set on a server and need to rename one of them, you’ll need to update all the stored procedures in the ReportServer database to point to the properly named TempDB. You can script out all the procedures in ALTER form in SSMS and then run a find/replace on ReportServerTempDB. Perhaps it should be a best practice when installing a report server not to give the database the default name – use some kind of numbering scheme – so we don’t have to resort to hacky solutions when things go bad and you have to consolidate some apps onto one server.

Another note – I also tried to attach the 2005 report server database to a 2008 report instance – it didn’t automatically upgrade it, just got an error.

Another challenge is managing the subscription jobs when setting up a new server. I wrote a script which finds all the jobs with a category of ‘Report Server’ and generates statements to enable, disable, or delete them. This let me set up the new production server and then disable all the jobs until the cutover – otherwise we would get duplicate reports emailed to people. The script is below – set your output to text and run the output.

--disable
USE msdb
SELECT 'EXEC msdb.dbo.sp_update_job @job_id = ''' + CONVERT(VARCHAR(100), j.job_id) + ''', @enabled = 0'
FROM dbo.sysjobs_view j
INNER JOIN dbo.syscategories c ON j.category_id = c.category_id
WHERE c.name = 'Report Server'
--enable
USE msdb
SELECT 'EXEC msdb.dbo.sp_update_job @job_id = ''' + CONVERT(VARCHAR(100), j.job_id) + ''', @enabled = 1'
FROM dbo.sysjobs_view j
INNER JOIN dbo.syscategories c ON j.category_id = c.category_id
WHERE c.name = 'Report Server'
--delete
USE msdb
SELECT 'EXEC msdb.dbo.sp_delete_job @job_id = ''' + CONVERT(VARCHAR(100), j.job_id) + ''''
FROM dbo.sysjobs_view j
INNER JOIN dbo.syscategories c ON j.category_id = c.category_id
WHERE c.name = 'Report Server'

Important notes from this install

If you want to upgrade 2005, don’t put 2008 on the box first. I’ve had it work once and fail twice.

Stopping and starting the report service in the report configuration manager will sometimes fix things.

Make sure the SSL certificate’s friendly name is the same as the URL. Getting 2005 set up on Server 2008 needed this.

Installing reporting services 2005 on IIS 7 takes a little additional config.  You’ll need to enable some features in IIS7 and make sure to run SSRS in a classic .net app pool.  http://support.microsoft.com/kb/934164

Be sure to assign the ssl certificate to the web site, just as you would in IIS6.  If the check mark to require SSL is grayed out in SSRS configuration manager, you probably haven’t done this.

Make sure your server can ping the URL you are planning on using, and that you have IIS running if you are planning on using SSL in 2008 – it is a dependency. If you are adding SSL bindings in the 2008 SSRS config tool and they are successful but not showing up, then something has been hosed. This can be fixed by using the HttpSysConfig.exe tool to delete the http.sys bindings and then deleting the entries for those URLs from the report server config file. I had this issue during our first attempt and couldn’t figure it out during our allotted downtime.

The new reporting server is on a VMWare virtual 2008 server – love it!

Phew!! Hopefully everything works out this weekend – I’m sick of Reporting Services.

HTTP 404 Not Found with Reporting Services and Windows 7 (Server 2008)

IIS reserves port 80. SQL 2008 Reporting Services now uses its own ‘web server’ – HTTP.sys. In Reporting Services Configuration Manager, change the Report Manager URL advanced settings: in ‘Multiple Identities for Report Manager’, edit the entry and change the TCP port to something other than 80. Common alternates are 8080 or 8000.
http://technet.microsoft.com/en-us/library/bb630449.aspx