I was looking for a code example of capturing RESTORE FILELISTONLY output into a table, but couldn’t find one on the web, so here it is. You would of course need to build @cmd with your own variables.
DECLARE @cmd NVARCHAR(4000);

-- @localBackupPath, @dbname and @filename are your own variables
SELECT @cmd = 'RESTORE FILELISTONLY FROM DISK = ''' + @localBackupPath
    + @dbname + '\' + @filename + '''';

CREATE TABLE #files
    (
      [LogicalName] NVARCHAR(128) ,
      [PhysicalName] NVARCHAR(260) ,
      [Type] CHAR(1) ,
      [FileGroupName] NVARCHAR(128) ,
      [Size] NUMERIC(20, 0) ,
      [MaxSize] NUMERIC(20, 0) ,
      [FileId] BIGINT ,
      [CreateLSN] NUMERIC(25, 0) ,
      [DropLSN] NUMERIC(25, 0) ,
      [UniqueId] UNIQUEIDENTIFIER ,
      [ReadOnlyLSN] NUMERIC(25, 0) ,
      [ReadWriteLSN] NUMERIC(25, 0) ,
      [BackupSizeInBytes] BIGINT ,
      [SourceBlockSize] INT ,
      [FileGroupId] INT ,
      [LogGroupGUID] UNIQUEIDENTIFIER ,
      [DifferentialBaseLSN] NUMERIC(25, 0) ,
      [DifferentialBaseGUID] UNIQUEIDENTIFIER ,
      [IsReadOnly] BIT ,
      [IsPresent] BIT
    );

INSERT INTO #files
EXEC ( @cmd );
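Once the file list is captured, you can query #files to do something useful with it — for example, generating the MOVE clauses for a restore. This is just a sketch; the target paths and file extensions below are placeholders you would swap for your own:

```sql
-- Hypothetical example: build MOVE clauses from the captured file list.
-- 'D:\Data\' and 'E:\Logs\' are placeholder target paths.
SELECT  'MOVE ''' + LogicalName + ''' TO '''
        + CASE [Type] WHEN 'L' THEN 'E:\Logs\' ELSE 'D:\Data\' END
        + LogicalName
        + CASE [Type] WHEN 'L' THEN '.ldf' ELSE '.mdf' END
        + ''''
FROM    #files;
```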
I was happy to see that we do quite a few of the things on Brent Ozar’s Backup best practices page.
I’ve one-upped Brent on one point. We have an SSIS package that restores databases from server to server, driven by our DBA_REP database, which I borrowed from Rodney Landrum and modified for our environment.
One of Brent’s ideas I appreciate is his approach to keeping management apprised of restore times:
Keep management informed on restore time estimates.
Every quarter, I look at the backup sizes & speeds, and estimate how long it will take to restore a server from tape. I turn that into a written report to management, and for each server, I give cost estimates to decrease that outage window. Sometimes the answer is third-party backup compression software, sometimes it’s more CPU power for the server, sometimes it’s adding memory, etc. The primary goal is to give them enough information to make a good business decision about whether it’s worth the money. The secondary goal is to CYA: when a server is down, and a restore is taking longer than the business owners would like, then it’s a great time to pull out last quarter’s report and say, “See, I told you this would take an hour to restore, and you said it wasn’t worth the $5,000 to shrink down. Should we reconsider that decision now?”
I think we’ll be using this approach on a new project.
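Brent’s sizing exercise can be started from the backup history SQL Server already keeps in msdb. A sketch — the columns are standard msdb.dbo.backupset columns, and the throughput math assumes restore speed roughly tracks backup speed, which is only an approximation:

```sql
-- Backup sizes and speeds from msdb history, most recent first.
SELECT  database_name ,
        CAST(backup_size / 1048576.0 AS DECIMAL(10, 1)) AS size_mb ,
        DATEDIFF(SECOND, backup_start_date, backup_finish_date) AS backup_seconds ,
        CAST(backup_size / 1048576.0
             / NULLIF(DATEDIFF(SECOND, backup_start_date, backup_finish_date), 0)
             AS DECIMAL(10, 1)) AS mb_per_second
FROM    msdb.dbo.backupset
WHERE   [type] = 'D' -- full backups only
ORDER BY backup_finish_date DESC;
```

Dividing a database’s backup size by its observed MB/second gives a rough floor for the restore window you report to management.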
Last night while dealing with a full transaction log, it was requested that I restore a database. I generated the restore statement to a certain point in time and reviewed the script. The backup file was being retrieved from a non-standard location and when I went there, the backup file was not present! I tracked down a job on the dev server which was backing up this DB, restoring it to dev and then deleting the file! The result of this is that we effectively had no recovery available for the database since that backup was deleted. How did this happen and how can we avoid this?
It’s a frequent task for a DBA to create a backup for a developer to use in staging, development, or on their desktop. If they can make do with last night’s data, you can just give them a copy of the previous day’s backup. If they need something current, you’ll need to take another backup. You’ll probably run a statement like this:
BACKUP DATABASE MyDatabase TO DISK = 'E:\BACKUPS\MYBACKUP.BAK'
This file has now become a vital part of your recovery chain for this database. To recover anything past this point using transaction log backups taken later, you may need this file. So you can’t just drag it off to another server or share.
If you want to be free to move the file around or delete it, use this command instead:
BACKUP DATABASE MyDatabase TO DISK = '\\SERVER_1\SHARE$\MYBACKUP.BAK' WITH COPY_ONLY
You can back up straight to a network share on another machine and save some time copying files around. This doesn’t disturb your chain of backups and log backups and lets you do as you please with the file.
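You can also confirm after the fact which backups were taken with COPY_ONLY, since msdb records it. A sketch using the standard is_copy_only flag on msdb.dbo.backupset (available from SQL Server 2005 on); 'MyDatabase' is a placeholder:

```sql
-- Which recent backups are copy-only (and therefore safe to delete)?
SELECT  database_name ,
        backup_start_date ,
        [type] ,        -- 'D' = full, 'L' = log, 'I' = differential
        is_copy_only    -- 1 = taken WITH COPY_ONLY
FROM    msdb.dbo.backupset
WHERE   database_name = 'MyDatabase'
ORDER BY backup_start_date DESC;
```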
More information on log sequence numbers and how they work can be found here http://msdn.microsoft.com/en-us/library/ms190729.aspx .
I’ve always been a stickler for backups, even before I was a sysadmin. I create music on my home computer, creating massive volumes of data just waiting to disappear from my life.
I have been using SuperDuper and Time Machine on OS X. I don’t back up my Windows machine; since it’s a media center, we mostly just stream stuff off the web.
When I leave town I throw the backup drives in an unlocked fireproof safe with a note begging thieves not to steal them. This data is truly unique and irreplaceable.
But what if there was a fire when I didn’t have my drives in the safe? I’d be very sad, that’s what.
There are a few online backup services which are starting to look very attractive. Most offer 2GB for free. I’ll be needing at least 50GB for audio mixes. Actual song files are another story….maybe 300GB or so. Then there are various video, picture and music files I’d hate to lose. So it’s looking like I’ll be needing around 500GB to lock it all up.
Enter Mozy – $4.95 a month for unlimited storage space. I assume this is for one computer. What a deal. That puts a year at about $60, and 40 years at $2,400.