SQL Express – How do I back this up?

An alarming number of third-party applications come with an installation of SQL Express and no intention of it ever getting backed up.   I have been lucky never to have found this out the hard way, but I can imagine many who have.   So what do we do about these SQL Express instances that have no SQL Agent and no simple way of backing them up?   I have a couple of approaches that we have used in the past and figured they were worth sharing.  If this does not help anyone, maybe it got you to go out and scan your LANs for SQL Express instances, which would be a win on its own.

  • The SSIS Approach

Assuming you have another server pretty much anywhere with SQL Server Integration Services installed on it, you can create a very simple package with an Execute SQL Task that runs the backup against the remote SQL Express instance.  You can follow this up with file deletion, using either a Script Task to delete expired files or the xp_delete_file command that Patrick Keisler covers very nicely on his blog.  A simple approach, and it is usually within most people's basic tool set, since it is just SSIS and a fairly simple package.
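As a sketch of what that Execute SQL Task might run, the statement is just a plain BACKUP DATABASE with a timestamped file name. The database name, UNC path, and WITH options below are placeholders for illustration, not the exact command from any particular package; here the T-SQL string is simply built in Python so you can see the shape of it:

```python
from datetime import datetime

def backup_command(dbname: str, backup_dir: str) -> str:
    """Build the BACKUP DATABASE statement an Execute SQL Task could run.
    The database name and UNC path are hypothetical placeholders."""
    stamp = datetime.now().strftime("%Y%m%d%H%M%S")
    backup_file = f"{backup_dir}{dbname}_{stamp}.bak"
    return (f"BACKUP DATABASE [{dbname}] "
            f"TO DISK = N'{backup_file}' "
            f"WITH INIT, CHECKSUM;")

print(backup_command("YourDBHere", "\\\\UNCPath\\Share\\"))
```

The timestamp in the file name is what lets the later cleanup step expire old files by age rather than overwriting a single backup.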

 

  • The PowerShell Approach

With the PowerShell approach, you need another SQL Server instance that is SQL 2008 or above (and not SQL Express) so that it can run a SQL Agent PowerShell job step.  This approach lets you specify the Server\Instance, the database name, the path you want to back your DB up to, and the path you would like to copy the file to, and, because you should, it even runs your CHECKDB for you.

####set up parameters
$servername = "mysqlexpress\instancename"
$dbname = "YourDBHere"
$bkdir = "\\UNCPath\Share\Name\AndFolderHere\"
$copydir = "\\AnotherUNCPath\ToCopyTheFileTo\SoThatYouHaveRedundantBackups\"
$searchstr = "*DBName*"  
$searchext = "*.bak"
$dt = get-date -format yyyyMMddHHmmss
$bkfile = $bkdir +  $dbname + $dt + ".bak"

[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO")  | out-null

# connect to the server to back up
$s = new-object ('Microsoft.SqlServer.Management.Smo.Server') $servername

# create the backup object and set its properties
$dbbk = new-object ('Microsoft.SqlServer.Management.Smo.Backup')
$dbbk.Action = 0
$dbbk.BackupSetDescription = "Backup of " + $dbname
$dbbk.BackupSetName = $dbname + " Backup"
$dbbk.Database = $dbname
$dbbk.Devices.AddDevice($bkfile, 2)
$dbbk.MediaDescription = "Disk"
$dbbk.Incremental = $false
 
#Backup the database
$dbbk.SqlBackup($s)
Copy-Item $bkfile $copydir -force

Invoke-SQLCMD -Query "DBCC CheckDB([$dbname]) with no_infomsgs" -ServerInstance $servername
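The script above leaves one loose end: expiring old backup files, which the SSIS approach handled with a Script Task or xp_delete_file. A minimal sketch of that retention step, written here in Python with only the standard library (the directory path and the seven-day window are assumptions, not values from the script above):

```python
import time
from pathlib import Path

def purge_old_backups(backup_dir: str, days_to_keep: int = 7) -> list:
    """Delete .bak files older than the retention window and return
    the paths that were removed. The directory is a placeholder."""
    cutoff = time.time() - days_to_keep * 86400
    removed = []
    for bak in Path(backup_dir).glob("*.bak"):
        # compare the file's last-modified time against the cutoff
        if bak.stat().st_mtime < cutoff:
            bak.unlink()
            removed.append(str(bak))
    return removed
```

Run against both $bkdir and $copydir, this keeps the redundant copies from growing without bound.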

Nothing earth-shattering here, but if you ever find that you have an extra SQL Express instance out there and it is not being backed up, either of these two approaches should get that job done for you.

 

Capturing Disk Space within SQL Server – 2000 to Current

Over the years, there have been many different ways to capture disk space within SQL Server.    Some of them were a lot more “user friendly” than others, but in the end we just want to get the job done.  Here are the ways that I am aware of through the years and some of the advantages and disadvantages of each.

SQL 2000-2005

In the SQL 2000 days, there were not a ton of options for the person who wanted to run stuff out of the SQL Agent and load it to SQL Server.   You really had just a couple of things:

  • xp_fixeddrives
/*Option 1 - the easy approach that does not give you quite as much information*/
CREATE TABLE FreeSpace
(
 id INT IDENTITY(1,1) PRIMARY KEY CLUSTERED
 ,DriveLetter VARCHAR(5)
 ,FreeSpaceMB INT
 ,DateCreated DATETIME DEFAULT GETDATE()
);
GO

And then you could run a process that would load this data into your tables periodically.

INSERT INTO dbo.FreeSpace(DriveLetter, FreeSpaceMB)
EXEC xp_fixeddrives;

This is not an awful option: it is easy to set up, runs quickly, and requires no real “coding” knowledge outside of basic T-SQL, but it gives you a very limited amount of information.   You only see the drives that SQL Server is doing something with, and you only see the free space on those drives.  In many cases that is good enough, but it is certainly not fantastic.

  • xp_cmdshell or command prompt options

Doing this, you can get the information in a variety of ways, so I will only go through a very simple one just to give you a feel.   It is nice in that you can find out both the total space and the free space, but it is a bit of a pain because of the way you have to roll through to get the available drive letters.   If you need total space along with free space and you are still running SQL 2000 (which I sincerely hope you are not), then this will get it for you.

First the set up:  Create our table to store this data.

/*Option 2 - More information but a big mess.*/
CREATE TABLE DriveSpace
(
 id INT IDENTITY(1,1) PRIMARY KEY CLUSTERED
 ,DriveLetter VARCHAR(5)
 ,FreeSpaceGB INT
 ,TotalSpaceGB INT
 ,DateCreated DATETIME DEFAULT GETDATE()
);
GO

Now we go through and get all of this information.  This is a many-stepped process, so let me walk you through it before putting in the fairly lengthy script.    The first thing it has to do is grab the drives that are on the server.   This is pre-PowerShell (note I am assuming a SQL 2000 machine), so we had to do this with WMI calls through xp_cmdshell; standard security rules and permissions apply.  After we have grabbed those disks, we then need to get the space data (free and total) for them.  That is also done through xp_cmdshell, specifically via fsutil.  Once we have that information, we have to parse through it to find the relevant values so that we can finally insert them into our DriveSpace table.  Long explanation, so here is the code; I have tried to comment it in a way that separates those parts out as well as possible.

/*Create the table to store the local drives on the server*/
IF OBJECT_ID('tempdb..#drives') IS NOT NULL 
 DROP TABLE #drives;
GO
CREATE TABLE #drives
(
 DriveName VARCHAR(200)
);
GO
/*Get the list of local drives*/
INSERT INTO #drives
EXEC xp_cmdshell 'wmic logicaldisk get description,name | findstr /C:"Local"';
GO
/*Create the table to store the holding data of the drive information. Not well formed.*/
IF OBJECT_ID('tempdb..#diskspace_hold') IS NOT NULL
 DROP TABLE #diskspace_hold;
GO
CREATE TABLE #diskspace_hold
( ColumnOutput VARCHAR(200)
);
GO
/*Same as above, but with drive associated to it.*/
IF OBJECT_ID('tempdb..#diskspace') IS NOT NULL
 DROP TABLE #diskspace;
GO
CREATE TABLE #diskspace
( Drive VARCHAR(5)
 ,ColumnOutput VARCHAR(200)
);

GO
DECLARE @drive VARCHAR(2)
 ,@cmd VARCHAR(2000);
/*First get a distinct list of the drives*/
DECLARE csr CURSOR FOR
SELECT DISTINCT LEFT(LTRIM(REPLACE(DriveName, 'Local Fixed Disk', '')), 2) AS DriveLetter
FROM #drives
WHERE DriveName LIKE 'Local Fixed Disk%'

OPEN csr
FETCH NEXT FROM csr INTO @drive

WHILE @@FETCH_STATUS = 0
BEGIN
/*For each drive grab the disk space (free and total) to load into our table for parsing*/
SET @cmd = 'fsutil volume diskfree ' + @drive
PRINT @cmd
TRUNCATE TABLE #diskspace_hold
INSERT INTO #diskspace_hold
 ( ColumnOutput )
EXEC xp_cmdshell @cmd;

/*Load into our table with the drive associated to it*/
INSERT INTO #diskspace
 ( Drive, ColumnOutput )
SELECT @drive, 
 ColumnOutput 
FROM #diskspace_hold AS dh

FETCH NEXT FROM csr INTO @drive;
END
CLOSE csr;
DEALLOCATE csr;
GO
DECLARE @drive VARCHAR(5),
 @gbfree int,
 @gbtotal int;
/*Now that we have our drives with disk space data, parse it out to a usable integer style format*/
DECLARE csr CURSOR
FOR
 SELECT DISTINCT
 d.Drive
 FROM #diskspace AS d;

OPEN csr;
FETCH NEXT FROM csr INTO @drive;

WHILE @@FETCH_STATUS = 0
BEGIN

 SELECT @gbfree = CAST(REPLACE(d.ColumnOutput, 'Total # of free bytes : ', '') AS BIGINT) / 1073741824
 FROM #diskspace AS d
 WHERE d.ColumnOutput LIKE 'Total # of free bytes%'
 AND d.Drive = @drive 

 SELECT @gbtotal = CAST(REPLACE(d.ColumnOutput, 'Total # of bytes : ', '') AS BIGINT) / 1073741824 
 FROM #diskspace AS d
 WHERE d.ColumnOutput LIKE 'Total # of bytes%'
 AND d.Drive = @drive 
 
 INSERT INTO dbo.DriveSpace
 (
 DriveLetter,
 FreeSpaceGB,
 TotalSpaceGB,
 DateCreated
 )
 SELECT 
 @drive, 
 @gbfree,
 @gbtotal,
 GETDATE();



 FETCH NEXT FROM csr INTO @drive;
END

CLOSE csr;
DEALLOCATE csr;
GO
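To make the parsing step above concrete, here is the same byte-to-GB extraction sketched in Python. The sample lines mimic fsutil's output format; the exact label spacing can vary by Windows version, which is exactly why the T-SQL version's string matching is fragile:

```python
def parse_fsutil(lines):
    """Pull free and total bytes out of `fsutil volume diskfree` style
    output and convert to whole GB, mirroring the T-SQL REPLACE/CAST step."""
    space = {}
    for line in lines:
        if ":" not in line:
            continue
        label, _, value = line.partition(":")
        label = label.strip()
        if label == "Total # of free bytes":
            space["free_gb"] = int(value.strip()) // 1073741824
        elif label == "Total # of bytes":
            space["total_gb"] = int(value.strip()) // 1073741824
    return space

sample = [
    "Total # of free bytes        : 107374182400",
    "Total # of bytes             : 214748364800",
    "Total # of avail free bytes  : 107374182400",
]
print(parse_fsutil(sample))  # {'free_gb': 100, 'total_gb': 200}
```

Splitting on the colon and stripping whitespace sidesteps the fixed-width label problem that the REPLACE calls in the T-SQL version depend on.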

 

SQL 2008

To begin with, the methods mentioned above will work for SQL 2008 as well.  But SQL 2008 added PowerShell as a SQL Agent step option, which opened things up for those wanting to capture disk space in a much cleaner way than what I have above.   A fair number of people have put together scripts to do this, and the one I prefer is the one the Scripting Guy blogs about.  Even better, it goes through a whole lot more than just capturing disk space; it shows you how to capture data in general with PowerShell.   It is an excellent read, and he does it a lot better than I could (especially since it is the method I used with SQL 2008 anyway).  Plus, it can cross servers if you want a central repository.
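Outside of a SQL Agent step, the same capture is easy to sketch in any scripting language. For example, Python's standard library exposes total and free bytes for a volume directly (the path below is a placeholder; on Windows you would pass a drive root like "C:\\"):

```python
import shutil

def drive_space_mb(path: str):
    """Return (total_mb, free_mb) for the volume containing `path`,
    matching the MB units used in the queries later in this post."""
    usage = shutil.disk_usage(path)
    return usage.total // (1024 * 1024), usage.free // (1024 * 1024)

total_mb, free_mb = drive_space_mb("/")  # e.g. "C:\\" on Windows
```

This sees every volume you point it at, not just the ones SQL Server has files on, which is the same advantage the WMI-based PowerShell approach has over xp_fixeddrives.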

 

SQL 2008 R2 and Higher

Once we got to SQL 2008 R2, everything got even simpler for those who just wanted to grab this data and store it locally.   I first read about this at SQLPerformance.com in a post by Erin Stellato about proactive monitoring.  It just does not get easier than this.

SELECT DISTINCT
 vs.volume_mount_point AS [Drive],
 vs.logical_volume_name AS [Drive Name],
 vs.total_bytes/1024/1024 AS [Drive Size MB],
 vs.available_bytes/1024/1024 AS [Drive Free Space MB]
FROM sys.master_files AS f
CROSS APPLY sys.dm_os_volume_stats(f.database_id, f.file_id) AS vs
ORDER BY vs.volume_mount_point;

So that is my list of ways to capture drive information from within SQL Server from SQL 2000 to present.   It has morphed fairly drastically over the years and has gotten much easier over time.

 

Migrating to Azure SQL Database – System.OutOfMemory

Have you ever gone through the process of importing your newly created dacpac into Azure SQL Database and after many minutes or hours of waiting, you get a failure of type

 Exception of type 'System.OutOfMemoryException' was thrown. (Microsoft.Data.Tools.Schema.Sql) 

Well, I did, and it was a very large pain to debug.  Here is where I ended up.

Initial Setup

To start with, the dacpac was originally created on a SQL Server 2012 RTM instance (no SPs or CUs applied).    The dacpac itself seemed fine, because I was able to load it to my own local SQL Server 2014 instance without any problem, but for whatever reason the same would not work when pointing to our Azure SQL DB instance.  It is notable that the process loaded the dacpac from local storage, not from Azure Storage.

First Attempt

My first thought was to see how things would work if I created the dacpac from the database on my local SQL 2014 instance (which had been created from the original dacpac anyway).    So I did, and it worked.   That seemed very nice, but I have absolutely no love for the idea of having to:

  • Create dacpac on SQL 2012 instance
  • Install dacpac on SQL 2014 instance
  • Create new dacpac with the same data from SQL 2014 instance
  • Load dacpac to Azure SQL DB

This just did not seem like fun to me, but given the scenario it could be a reasonable workaround (we do not have to do this often).  However, when I took this same approach with a different and larger database (around 8 GB, which is small to me in the real world but seemingly large for Azure SQL DB), the process failed anyway.   So this was not foolproof.

Next Attempt

Since the standard dacpac did not work in all cases, I figured I would try one of the many other solutions that come up when searching for issues of this nature.  One of the first was the Azure Migration Wizard from Codeplex, so I thought I would give it a shot.   The process itself is pretty self-explanatory once you have downloaded the application, and I selected the following parameters:

  • On the opening screen I selected “Analyze/Migrate” and chose the “Database” button.   For the Target Server I selected “Azure SQL Database”
  • Clicking Next brought me to a standard server connection item where I entered the name of my local server and database that I had previously loaded that dacpac into.
  • On the choose objects screen, I chose “Script All Database Objects”
  • I had it then generate the script and when it asked for the credentials to my Azure SQL Database I added them and connected.
  • From here the application did its thing.  Several hours later, everything completed without problem (no out of memory exception with this application, at least not in my experience so far).

This seemed to work great.    The application appears to do similar things to a bacpac, inasmuch as it appears to call bcp at some level underneath, to process the commands in a similar fashion, and to do basically the same kind of work.   But it works, and so far it has worked for me in each of my scenarios.

So if you come across this issue, this would be my recommendation in terms of trying something new.  It is easy, free and seems to work.

 

sp_who5

I wanted to share a procedure that I wrote and use every day for checking what is active on the server.   It is nowhere near as thorough as something like Adam Machanic’s fantastic sp_whoisactive or Brent Ozar’s sp_askBrent, but for me it works better than sp_who or the undocumented sp_who2.  I find it a very useful way to see what is running, including which specific statement is running if it is a multi-statement procedure.  I also find the same procedure useful for capturing data on a regular basis (every minute or so) to get data on your server with regularity and without much overhead at all.   Obviously you might be able to get some of the same things from the System Health Session or a handful of other ways, but this will quickly show you processes that are hitting the system hard, how many threads they are using, what their wait type is, who they are blocking, and what specific statement is running.

I am betting that there are some enhancements that would make this better, so please share them if you have thoughts and feel free to use this wherever you would like.

 

USE master;
GO
IF OBJECT_ID('master..sp_who5') IS NOT NULL
	DROP PROCEDURE dbo.sp_who5;
GO
CREATE PROCEDURE sp_who5
AS
SELECT er.session_Id
	, blocked
	, DB_NAME(sp.dbid) AS database_name
	, er.percent_complete
	, ecid AS thread
	, nt_username AS [user]
	, er.status AS status
	, CASE WHEN er.statement_start_offset > 1 AND er.statement_end_offset > 1
		THEN SUBSTRING (qt.text, er.statement_start_offset/2 + 1, 
							(CASE WHEN er.statement_end_offset = -1 
									THEN LEN(CONVERT(NVARCHAR(MAX), qt.text)) * 2
									ELSE er.statement_end_offset END - er.statement_start_offset)/2 + 1) 
		ELSE qt.text END AS CurrentQuery
	, er.command
	, qt.text AS ParentQuery
	, program_name 
	, Hostname
	, start_time
	, wait_type 
FROM sys.dm_exec_requests er
INNER JOIN sys.sysprocesses sp ON er.session_id = sp.spid
CROSS APPLY sys.dm_exec_sql_text(er.sql_handle) AS qt
WHERE er.session_id > 50    -- no system spids
AND er.session_Id <> @@SPID  -- remove the current session
ORDER BY er.session_id
		, blocked;
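The trickiest part of the procedure is the CurrentQuery expression: statement_start_offset and statement_end_offset are byte offsets into the Unicode (two bytes per character) batch text, with -1 meaning “to the end of the batch.” A sketch of that same math in Python, with a made-up batch for illustration:

```python
def current_statement(batch_text: str, start_offset: int, end_offset: int) -> str:
    """Mirror sp_who5's SUBSTRING logic: the offsets are byte positions
    into UTF-16 text, so divide by 2 to get character positions;
    an end offset of -1 means read to the end of the batch."""
    start = start_offset // 2
    end = len(batch_text) if end_offset == -1 else end_offset // 2
    return batch_text[start:end]

batch = "SELECT 1; SELECT 2;"
# the second statement starts at character 10, i.e. byte offset 20
print(current_statement(batch, 20, -1))  # SELECT 2;
```

This is why a long stored procedure shows up with both a ParentQuery (the whole batch) and a CurrentQuery (the one statement the engine is on right now).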