Get the last database backup for all databases in a SQL Server instance

I don’t usually publish small code snippets, but I’ve just rewritten one of my most used SQL Server scripts, which gets the details of the last backup for every database (both full and log backups).

It now makes use of a common table expression (with) and rank() over(), which makes it much easier to read and modify.

So I think it’s worth sharing.

with
db as (
	select [Instance] = @@SERVERNAME,
		[Database]  = name,
	    [RecoveryMode]   = DATABASEPROPERTYEX(name, 'Recovery'),
	    [CreationTime]   = crdate,
	    [Status]         = DATABASEPROPERTYEX(name, 'Status')
		from master..sysdatabases
		where name!='tempdb'
),
lastfull as	(
	select * from (
	 select [Database]     = s.database_name,
		--[Type]   = s.type,
		[LastFullDate] = convert(varchar, s.backup_finish_date, 120),
		[LastFullSize]   = s.backup_size,
		[LastFullDevice] = f.physical_device_name,
        [LastFullDevTyp] = f.device_type,
		[Nrank] = rank() over (partition by s.database_name order by s.backup_finish_date desc)
	from msdb.dbo.backupset s, msdb.dbo.backupmediafamily f
		where 
         s.media_set_id=f.media_set_id
		and s.type='D'
		-- and f.device_type = 7 -- only backup devices
	) f
	where nrank=1
),
lastlog as (
	select * from (
	 select [Database]     = s.database_name,
		--[Type]   = s.type,
		[LastLogDate] = convert(varchar, s.backup_finish_date, 120),
		[LastLogSize]   = s.backup_size,
		[LastLogDevice] = f.physical_device_name,
        [LastLogDevTyp] = f.device_type,
		[Nrank] = rank() over (partition by s.database_name order by s.backup_finish_date desc)
	 from msdb.dbo.backupset s, msdb.dbo.backupmediafamily f
		where 
         s.media_set_id=f.media_set_id
		and s.type='L'
		-- and f.device_type = 7 -- only backup devices
	) l
	where nrank=1
)
select db.[Instance],db.[Database], db.[RecoveryMode], db.[CreationTime], db.[Status],
	lastfull.[LastFullDate], lastfull.[LastFullSize], 
        lastfull.[LastFullDevice], lastfull.[LastFullDevTyp],
	lastlog.[LastLogDate], lastlog.[LastLogSize], lastlog.[LastLogDevice], lastlog.[LastLogDevTyp]
from db
	left outer join lastfull
		on (db.[Database]=lastfull.[Database])
	left outer join lastlog
		on (db.[Database]=lastlog.[Database])
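The heart of the script is the rank() over (partition by database_name order by backup_finish_date desc) window: within each database it numbers the backups newest-first, and the outer where nrank=1 keeps only the most recent one. The same latest-row-per-group idea, sketched in Python purely for illustration (ignoring ties, which rank() would keep):

```python
from itertools import groupby
from operator import itemgetter

# Hypothetical rows: (database_name, backup_finish_date)
backups = [
    ("db1", "2013-04-01"), ("db1", "2013-04-20"),
    ("db2", "2013-04-15"),
]

# Equivalent of keeping only rank() = 1 per partition:
# the newest row of each database_name group.
latest = {
    db: max(rows, key=itemgetter(1))
    for db, rows in groupby(sorted(backups), key=itemgetter(0))
}
print(latest["db1"])  # ('db1', '2013-04-20')
```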

As you can see, modifying it to include, for example, differential backups should be very easy.

Cheers

Ludo

SQLServer centralized backup monitoring with PowerShell and TSQL (2/2)

From commons.wikipedia.org. Author: włodi.

In my previous post I’ve shown how to collect data and insert it into a database table using PowerShell. Now it’s time to get some information out of that data, and I’ve used TSQL for this purpose.

The backup exceptions

Every environment has some backup rules and backup exceptions. For example, you don’t want to check for failures on the model, northwind, adventureworks or distribution databases.

I got rid of this problem by using the “exception table” created in the previous post. The rules are defined by pattern matching. First, we need to define a generic rule for our default backup schedules:

INSERT INTO [Tools].[dbo].[DB_Backup_Exceptions]
           ([InstanceName],[DatabaseName],[LastFullHours]
           ,[LastLogHours],[Description],[BestBefore])
     VALUES
           ('%','%',36,12,'Default delays for all databases',NULL)
GO

In the previous example, we check that all databases (‘%’) on all instances (again ‘%’) have had a full backup in the last 36 hours and a log backup in the last 12 hours. The description is useful to remember why the rule exists.

The “BestBefore” column lets you define an expiry date for the rule. For example, if you do some maintenance and skip some schedules, you can safely insert a rule that expires after X days: you avoid alerts in the meantime without running the risk of forgetting to delete the rule.

INSERT INTO [Tools].[dbo].[DB_Backup_Exceptions]
           ([InstanceName],[DatabaseName],[LastFullHours]
           ,[LastLogHours],[Description],[BestBefore])
     VALUES
           ('SERVER1','%',1000000,1000000,'Maintenance until 12.05.2013', '2013-05-12 00:00:00')
GO

The previous rule will skip backup reports on SERVER1 until May 12th.

INSERT INTO [Tools].[dbo].[DB_Backup_Exceptions]
           ([InstanceName],[DatabaseName],[LastFullHours]
           ,[LastLogHours],[Description],[BestBefore])
     VALUES
           ('%','Northwind',1000000,1000000,'Don''t care about northwind',NULL)
GO

The previous rule will skip all reports on all Northwind databases.

Important: If multiple rules apply to the same database, the rule with the highest time threshold wins.
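Since rules are matched with LIKE and the highest threshold wins, the effective threshold for a database is the maximum LastFullHours over all non-expired matching rules. A sketch of that resolution logic, in Python purely for illustration (the rule values are made up, and fnmatch’s ‘*’ stands in for SQL LIKE’s ‘%’):

```python
from datetime import datetime
from fnmatch import fnmatch

# Hypothetical in-memory copy of DB_Backup_Exceptions:
# (instance_pattern, db_pattern, last_full_hours, best_before)
rules = [
    ("*", "*", 36, None),                              # default for everything
    ("SERVER1", "*", 1000000, datetime(2013, 5, 12)),  # maintenance window
]

def full_threshold(instance, database, now):
    """Effective LastFullHours threshold: the maximum over all
    non-expired rules matching this instance/database."""
    matching = [hours for inst, db, hours, limit in rules
                if fnmatch(instance, inst) and fnmatch(database, db)
                and (limit is None or limit > now)]
    return max(matching)

print(full_threshold("SERVER1", "mydb", datetime(2013, 5, 1)))  # 1000000
print(full_threshold("SERVER1", "mydb", datetime(2013, 6, 1)))  # 36
```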

The queries

The following query lists the databases with the last backup full older than the defined threshold:

select     s.InstanceName,    s.DatabaseName,
        s.LastFull,
        floor(convert(float,GETDATE()-LastFull)*24) as LastFullAge
from db_status s, DB_Backup_Exceptions e
    where s.InstanceName like e.InstanceName and s.DatabaseName like e.DatabaseName
    and (e.bestbefore is null or e.bestbefore > GETDATE())
    and s.DatabaseStatus!='OFFLINE'
group by s.InstanceName, s.DatabaseName, s.LastFull
having LastFull < GETDATE()-(max(e.LastFullHours)/24.0)
order by s.InstanceName, s.DatabaseName
GO
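The LastFullAge expression converts the datetime difference into whole hours: T-SQL datetime subtraction cast to float yields fractional days, so multiplying by 24 and flooring gives hours. The same arithmetic, sketched in Python for illustration:

```python
from datetime import datetime
from math import floor

def age_hours(last_backup, now):
    """Same arithmetic as floor(convert(float, GETDATE() - LastFull) * 24):
    the difference in fractional days, times 24, rounded down."""
    days = (now - last_backup).total_seconds() / 86400
    return floor(days * 24)

# A full backup finished 37.5 hours ago: older than the 36-hour default.
print(age_hours(datetime(2013, 5, 1, 0, 0), datetime(2013, 5, 2, 13, 30)))  # 37
```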

And the following will do the same for the transaction logs:

select     s.InstanceName, s.DatabaseName,
         s.LastLog,
         floor(convert(float,GETDATE()-LastLog)*24)  as LastLogAge
    from db_status s, DB_Backup_Exceptions e
        where s.InstanceName like e.InstanceName and s.DatabaseName like e.DatabaseName
    and (e.bestbefore is null or e.bestbefore > GETDATE())
    and s.DatabaseStatus!='OFFLINE' and s.RecoveryMode='FULL'
    group by s.InstanceName, s.DatabaseName, s.LastLog
    having LastLog < GETDATE()-(max(e.LastLogHours)/24.0)
    order by s.InstanceName, s.DatabaseName
GO

Putting all together

Copy and paste the following to a new Transact-SQL job step in SQLAgent:

declare @tableHTML varchar(max),
        @Email varchar(255),
        @Subject varchar(255);

SET @Email = 'your_mail@yourdomain.com'
SET @Subject = 'Report backup SQL Server'

/*
######################
# REPORT BACKUP FULL #
######################
*/
set @tableHTML =
    N'<html><body><h1>Backup full not completed within the defined threshold</h1>
    <table border="1" width="100%">
      <tr><b><th>Instance</th><th>Database</th><th>LastFull</th><th>Hours since Last Full</th></tr>
'

set @tableHTML = @tableHTML + isnull(CAST((
        select
        td = s.InstanceName, '',
        td = s.DatabaseName,  '',
        td = s.LastFull,  '',
        td = floor(convert(int,convert(float,GETDATE()-LastFull)*24,0)), '
        '
from db_status s, DB_Backup_Exceptions e
    where s.InstanceName like e.InstanceName and s.DatabaseName like e.DatabaseName
    and (e.bestbefore is null or e.bestbefore > GETDATE())
    and s.DatabaseStatus!='OFFLINE'
group by s.InstanceName, s.DatabaseName, s.LastFull
having LastFull < (GETDATE()-max(e.LastFullHours)/24.0)
order by s.InstanceName, s.DatabaseName
for XML path ('tr'), TYPE) AS NVARCHAR(MAX)),'');
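If the repeated td column aliases look odd: with FOR XML PATH('tr'), every column aliased td becomes a &lt;td&gt; element inside one &lt;tr&gt; per row, which is what builds the HTML table body. A rough Python sketch of that rendering, for illustration only (the real T-SQL output also carries the empty-string separators between cells):

```python
from html import escape

def rows_to_html(rows):
    """Each column becomes a <td> element inside one <tr> per row,
    mimicking what FOR XML PATH('tr') produces for 'td' aliases."""
    out = []
    for row in rows:
        cells = "".join(f"<td>{escape(str(col))}</td>" for col in row)
        out.append(f"<tr>{cells}</tr>")
    return "".join(out)

print(rows_to_html([("SERVER1", "mydb", "2013-05-01 02:00", 37)]))
```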

/*
######################
# REPORT BACKUP LOG  #
######################
*/
SET @tableHTML = @tableHTML + N'
    </table>
    <BR/><BR/>
    <h1>Backup log not completed within the defined threshold</h1>
    <table border="1" width="100%">
            <tr><b><th>Instance</th><th>Database</th><th>LastLog</th><th>Hours since Last Log</th></tr>
';

set @tableHTML = @tableHTML + isnull(CAST((
    select
        td = s.InstanceName, '',
        td = s.DatabaseName, '',
        td = s.LastLog, '',
        td = floor(convert(int,convert(float,GETDATE()-LastLog)*24,0)), '
        '
    from db_status s, DB_Backup_Exceptions e
        where s.InstanceName like e.InstanceName and s.DatabaseName like e.DatabaseName
    and (e.bestbefore is null or e.bestbefore > GETDATE())
    and s.DatabaseStatus!='OFFLINE' and s.RecoveryMode='FULL'
    group by s.InstanceName, s.DatabaseName, s.LastLog
    having LastLog < (GETDATE()-max(e.LastLogHours)/24.0)
    order by s.InstanceName, s.DatabaseName
for XML path ('tr'), TYPE) AS NVARCHAR(MAX)),'');

/*
###################
# LIST EXEPTIONS  #
###################
*/
SET @tableHTML = @tableHTML + N'
    </table>
    <BR/><BR/>
    <h1>List of exceptions currently defined</h1>
    <table border="1" width="100%">
            <tr><b><th>Instance</th><th>Database</th><th>LastFullHours</th><th>LastLogHours</th><th>Description</th><th>BestBefore</th></tr>
';

set @tableHTML = @tableHTML + CAST((
    select
        td = InstanceName, '',
        td = DatabaseName, '',
        td = LastFullHours, '',
        td = LastLogHours, '',
        td = Description, '',
        td = ISNULL(convert(varchar,bestbefore,127), 'never'), ''
    from DB_Backup_Exceptions
    where (bestbefore is null or bestbefore > GETDATE())
    order by InstanceName, DatabaseName
for XML path ('tr'), TYPE) AS NVARCHAR(MAX));

SET @tableHTML = @tableHTML + N'
    </table>
<br/><br/>
<font size="3">Information message. Please do not reply to this email.</h1></body></html>';

EXEC msdb.dbo.sp_send_dbmail
@recipients        = @Email,
@subject        = @Subject,
@body            = @tableHTML,
@body_format    = 'HTML',
@profile_name    = 'DBA'

Any comment appreciated!

Previous: SQLServer centralized backup monitoring with PowerShell and TSQL (1/2)

🙂

Ludovico

SQLServer centralized backup monitoring with PowerShell and TSQL (1/2)

U.S. Air Force: public domain image.

Checking database backups has always been one of the main concerns of DBAs. With Oracle it’s quite easy thanks to a central RMAN catalog, but with other databases doing it with little effort can be a real challenge.

Some years ago I developed a little framework to monitor all our SQLServer databases. This framework was based on Linux (strange but true!), bash, freetds, sqsh and flat configuration files. It’s still doing its job well, but not all SQLServer DBAs can deal with complex bash scripting, so a customer of mine asked me if I could rewrite it in a more Microsoft-friendly language.

So I decided to go for a PowerShell script in conjunction with a couple of tables for the configuration and the data, and a simple TSQL script to provide HTML reporting. I have to say I’m not an expert on PowerShell, and it’s far from being as flexible as other scripting languages (damn, compared to perl, python or php they only have the initial ‘P’ in common). However, I managed to do something usable.

The principle

This is quite simple: the PowerShell script looks up the list of instances in a reference table, then connects to each one in turn and retrieves the following data:

  • recovery mode
  • status
  • creation time
  • last full backup
  • last log backup

This data is merged into a table on the central repository. Finally, a TSQL script does some reporting.

sql_centralized_backup_monitor_schema

Custom classes in PowerShell

One of the big annoyances with PowerShell is the lack of custom class definitions, which is a special shame if we consider that PowerShell is highly object-oriented. To define your own classes to work with, you have to define them in another language (C# in this example):

Add-Type @'

using System;

public class DatabaseBackup
{
    public string instanceName;
    public string databaseName;
    public string recoveryMode;
    public string status;
    public DateTime creationTime;
    public DateTime lastFull;
    public DateTime lastLog;

    private TimeSpan diff;

    public double lastFullTotalHours () {
		diff = DateTime.Now - lastFull;
		return Math.Round(diff.TotalHours,2);
    }

    public double lastLogTotalHours () {
		diff = DateTime.Now - lastLog;
		return Math.Round(diff.TotalHours,2);
    }

}
'@

For better readability, I’ve put this definition in a separate file (DatabaseBackup.ps1).
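For comparison, here is the same class sketched as a Python dataclass (illustrative only, not part of the framework); the two methods mirror lastFullTotalHours() and lastLogTotalHours(), taking the reference time as a parameter instead of DateTime.Now:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class DatabaseBackupPy:
    """Python analogue of the C# DatabaseBackup class above."""
    instance_name: str
    database_name: str
    last_full: datetime
    last_log: datetime

    def last_full_total_hours(self, now):
        # Hours elapsed since the last full backup, rounded to 2 decimals.
        return round((now - self.last_full).total_seconds() / 3600, 2)

    def last_log_total_hours(self, now):
        # Hours elapsed since the last log backup.
        return round((now - self.last_log).total_seconds() / 3600, 2)

b = DatabaseBackupPy("SERVER1", "mydb",
                     datetime(2013, 5, 1, 0, 0), datetime(2013, 5, 2, 0, 0))
print(b.last_full_total_hours(datetime(2013, 5, 2, 12, 0)))  # 36.0
```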

The query that retrieves the data…

This is the query I actually use to get the information:

$query_bck_database = "select 
  [Database]=d.name,LastFull,LastTran,GetDate=getdate(), 
  RecoveryMode=DATABASEPROPERTYEX(d.name, 'Recovery'),
  CreationTime=d.crdate, Status=DATABASEPROPERTYEX(d.name, 'Status')
from master.dbo.sysdatabases d
left outer join
 (select database_name, LastFull=max(backup_finish_date)
        from msdb.dbo.backupset
        where type = 'D' and backup_finish_date <= getdate()
        group by database_name
 ) b
on d.name = b.database_name
left outer join
 (select database_name, LastTran=max(backup_finish_date)
        from msdb.dbo.backupset
        where type ='L' and backup_finish_date <= getdate()
        group by database_name
 ) c
on d.name = c.database_name
 where d.name <> 'Tempdb'
order by [LastFull]";

I’ve also put this snippet in a separate file, queries.ps1, to improve readability.

The tables

The first table (DB_Servers) can be as simple as a single column containing the instances to check. This can be any other kind of source like a corporate CMDB or similar.

The second table will contain the collected data. Of course it can be expanded!

The third table will contain some rules for managing exceptions. Such exceptions are useful in situations like “all databases named northwind should not be checked”. I’ll show some examples in the next post.

CREATE TABLE [dbo].[DB_Servers] (
    [DB_Instance] [nvarchar](40) NOT NULL,
    CONSTRAINT [PK_DB_Servers] PRIMARY KEY (DB_Instance)
)
GO

CREATE TABLE [dbo].[DB_Status](
	[InstanceName] [varchar](50) NOT NULL,
	[DatabaseName] [varchar](50) NOT NULL,
	[RecoveryMode] [varchar](12) NULL,
	[DatabaseStatus] [varchar](15) NULL,
	[CreationTime] [datetime] NULL,
	[LastFull] [datetime] NULL,
	[LastLog] [datetime] NULL,
	[LastUpdate] [datetime] NULL,
    PRIMARY KEY CLUSTERED ([InstanceName] ASC,[DatabaseName] ASC)
)
GO

CREATE TABLE [dbo].[DB_Backup_Exceptions](
	[InstanceName] [varchar](50) NOT NULL,
	[DatabaseName] [varchar](50) NOT NULL,
	[LastFullHours] [int] NULL,
	[LastLogHours] [int] NULL,
	[Description] [varchar](250) NULL,
	[BestBefore] [datetime] NULL,
    PRIMARY KEY CLUSTERED ([InstanceName] ASC,[DatabaseName] ASC)
)
GO


The main code

Change this to whatever you want…

 set-location "K:\TOOLS\CHECK_SQL\"

### environment to insert results/get lists
$serverName = "SERVER01\MSSQL01"
$databaseName = "Tools"

This initializes the files explained earlier:

## initialise a file with some variables containing queries (to offload the script)
. .\queries.ps1

## initialise a class to better manage database backups as objects
. .\DatabaseBackup.ps1

This adds the required snap-ins to query sqlserver:

if ( (Get-PSSnapin -Name sqlserverprovidersnapin100 -ErrorAction SilentlyContinue) -eq $null ) {
    Add-PsSnapin sqlserverprovidersnapin100
}
if ( (Get-PSSnapin -Name sqlservercmdletsnapin100 -ErrorAction SilentlyContinue) -eq $null ) {
    Add-PsSnapin sqlservercmdletsnapin100
}

Given an instance, the following function will:

  • Get the data in a ResultSet
  • Instantiate an instance of the DatabaseBackup class (the one we defined in the external file) for each row
  • Return an array of DatabaseBackup objects with all the data ready to be processed

function getDatabaseBackups ([String]$instance) {

  Write-Output "    Instance: $instance"
  $databases = invoke-sqlcmd -Query  $query_bck_database -Server $instance

  $i = 0
  foreach ( $database in $databases ) {
    $dbbck = new-object DatabaseBackup
    $dbbck.instanceName = $instance
    $dbbck.databaseName = $database.Database
    $dbbck.recoveryMode = $database.RecoveryMode
    $dbbck.creationTime = $database.CreationTime
    $dbbck.status = $database.Status
    if ( -not ( $database.IsNull("LastFull") ) ) {
      $dbbck.lastFull = $database.LastFull
    } else {
      $dbbck.lastFull = "01.01.1900 00:00:00"
    }
    if ( -not ( $database.IsNull("LastTran") ) ) {
      $dbbck.lastLog = $database.LastTran
    } else {
      $dbbck.lastLog = "01.01.1900 00:00:00"
    }

    [DatabaseBackup[]]$databasebackups += $dbbck
  }
  return $databasebackups
}

This is the real “main” of the script, connecting to the central instance and getting the list of the instances to check:

$Connection = New-Object System.Data.SQLClient.SQLConnection
$Connection.ConnectionString ="Server=$serverName;Database=$databaseName;trusted_connection=true;"
$Connection.Open()

$Command = New-Object System.Data.SQLClient.SQLCommand
$Command.Connection = $Connection

$instances = invoke-sqlcmd -Query "select [name]=db_instance from db_servers" -ServerInstance $serverName -Database $databasename

Finally, for each instance we have to check, we call the function that collects the data and insert the results into the central repository (I’m using a MERGE to update the existing records).

foreach ( $instance in $instances ) {

  $databasebackups = getDatabaseBackups ($instance.name);

  $databasebackups[1..($databasebackups.length-1)] | foreach {

    $_ | select-object instanceName,databaseName

    $Command.CommandText = "MERGE DB_Status as target USING (
  select '$($_.instanceName )','$($_.databaseName )','$($_.recoveryMode )','$($_.status )','$($_.creationTime)','$($_.lastFull)','$($_.lastLog)')
  as source (InstanceName, DatabaseName, RecoveryMode, DatabaseStatus, CreationTime, LastFull, LastLog)
ON (source.InstanceName=target.InstanceName and source.DatabaseName=target.DatabaseName)
 WHEN MATCHED THEN
  UPDATE SET RecoveryMode = source.RecoveryMode, DatabaseStatus = source.DatabaseStatus, CreationTime = source.CreationTime,
   LastFull = source.LastFull, LastLog = source.LastLog, LastUpdate=getdate()
 WHEN NOT MATCHED THEN
  INSERT (InstanceName, DatabaseName, RecoveryMode, DatabaseStatus, CreationTime, LastFull, LastLog, LastUpdate)
   VALUES (source.InstanceName, source.DatabaseName,source.RecoveryMode,source.DatabaseStatus, source.CreationTime, source.LastFull,source.LastLog,getdate() );
"
    $Command.ExecuteNonQuery() | out-null
  }
  Remove-Variable databasebackups
}
$Connection.Close()
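The MERGE statement above is an upsert: update the row when the (InstanceName, DatabaseName) key already matches, insert it otherwise. The same decision logic, sketched with a keyed dictionary in Python purely for illustration:

```python
# Upsert sketch: the dictionary key plays the role of the MERGE
# ON (InstanceName, DatabaseName) matching condition.
def merge_status(repo, row):
    key = (row["InstanceName"], row["DatabaseName"])
    repo[key] = row  # update when matched, insert when not

repo = {}
merge_status(repo, {"InstanceName": "S1", "DatabaseName": "db1", "LastFull": "2013-05-01"})
merge_status(repo, {"InstanceName": "S1", "DatabaseName": "db1", "LastFull": "2013-05-02"})
print(len(repo), repo[("S1", "db1")]["LastFull"])  # 1 2013-05-02
```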

How to use it

  • Create the tables and insert your instances into the table db_servers.
  • Put the three files (Collect_Backup_Data.ps1, queries.ps1 and DatabaseBackup.ps1) in a directory, then modify the instance name and db name in Collect_Backup_Data.ps1.
  • Schedule the main script in SQLAgent as an Operating system (CmdExec) step:
powershell.exe -file "K:\TOOLS\CHECK_SQL\Collect_Backup_Data.ps1"
  • Note that you can’t use the internal powershell of SQLServer because it’s not fully compatible with powershell 2.0.
  • Check that the table db_status is getting populated.

Limitations

  • The script uses Windows authentication, assuming you are working with a centralized domain user. If you want to use SQL authentication (for example, if you are a multi-tenant managed services provider) you need to store your passwords somewhere…
  • This script is intended to be used with single instances. It should work on clusters, but I haven’t tested it.
  • Check the backup chain all the way to the tape library: relying only on the information contained in msdb is not a reliable monitoring solution!

In my next post we’ll see how to generate HTML reports via email and manage exceptions.

Hope you’ll find it useful.

Again PLEASE, if you improve it, kindly send me back a copy or blog it and post the link in the comments!

Next: SQLServer centralized backup monitoring with PowerShell and TSQL (2/2)

Cheers

Ludo

Generating graphs massively from Windows Performance Counters logs

Windows Performance Monitor is an invaluable tool when you don’t have external enterprise monitoring tools and you need to face performance problems, whether you have a web/application server, a mail server or a database server.

But what I personally don’t like about it is what you get in terms of graphing. If you schedule and collect a large number of performance metrics, you will likely get lost adding and removing them in the graphical interface.

What I did a long time ago (and did again recently, after my old laptop was stolen 🙁 ) was to prepare a PHP script that parses the resulting CSV file and automatically generates one graph for each metric it finds.

Unfortunately, most Windows sysadmins among you will disapprove of the fact that I’ve done this using a Linux box. But I guess you can use my script if you install php inside cygwin. The other tool you need is rrdtool; again, I use it massively to solve my graphing needs.

How to collect your data

Basically you need to create a Data Collector within Performance Monitor that generates a log file. You can specify a CSV file directly (Log format: Comma separated) or generate a BLG file and convert it later (Log format: Binary). System dumps are not used, so if you use the standard Performance template, you can delete them from your collection.

Remember that the more counters you take, the longer the graph generation will take. The script does not run in parallel, so it will use only one core. Generally:

(Time to complete) = (Num Counters) * (Num Samples) * (Speed factor)

Where (Speed factor) depends on both the CPU speed and the disk speed, because of the huge number of syncs required to update several thousand files. I’ve tried to reduce the number of rrdupdate calls by queuing several update values in a single command line and I’ve noticed a significant performance increase, but I know it’s not enough.
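To make the formula concrete, here is a sketch in Python; the speed factor is a made-up figure, so measure your own environment before trusting the estimate:

```python
def estimated_minutes(num_counters, num_samples, speed_factor_s=0.002):
    """(Time to complete) = (Num Counters) * (Num Samples) * (Speed factor).
    speed_factor_s is a hypothetical per-update cost in seconds."""
    return num_counters * num_samples * speed_factor_s / 60

# 500 counters x 452 samples at 2 ms per update: about 7.5 minutes.
print(round(estimated_minutes(500, 452), 1))
```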

Converting a BLG (binary) log into a CSV log

Just use the relog tool:

C:\PerfLogs\Admin\Perftest\LUDO_20130423-000002> relog "Performance Counter.blg" -f csv -o "Performance Counter.csv"

Input
----------------
File(s):
     Performance Counter.blg (Binary)

Begin:    23.4.2013 14:56:02
End:      23.4.2013 15:33:37
Samples:  452

100.00%

Output
----------------
File:     Performance Counter.csv

Begin:    23.4.2013 14:56:02
End:      23.4.2013 15:33:37
Samples:  452

The command completed successfully.

Generating the graphs

Transfer the CSV on the box where you have the php and rrdtool configured, then run:


[root@lucrac01 temp]# php process_l.php PerformanceCounter.csv

453--Creating rrd /root/temp/LUDO/IPv4/Datagrams_Received_Delivered_sec.rrd

Creating rrd /root/temp/LUDO/IPv4/Datagrams_Received_Unknown_Protocol.rrd
Creating rrd /root/temp/LUDO/IPv4/Fragmented_Datagrams_sec.rrd
Creating rrd /root/temp/LUDO/IPv4/Datagrams_sec.rrd

...

Creating rrd /root/temp/LUDO/Memory/Pages_Input_sec.rrd
Creating rrd /root/temp/LUDO/Memory/Pool_Paged_Resident_Bytes.rrd
Creating rrd /root/temp/LUDO/Memory/Write_Copies_sec.rrd

...

Creating rrd /root/temp/LUDO/PhysicalDisk_2_E__/Avg._Disk_sec_Transfer.rrd
Creating rrd /root/temp/LUDO/PhysicalDisk_1_D__/Avg._Disk_sec_Transfer.rrd
Creating rrd /root/temp/LUDO/PhysicalDisk_0_C__/Avg._Disk_sec_Transfer.rrd
----......

1.Generating Graph: /root/temp/LUDO/IPv4/Datagrams_Received_Delivered_sec.png
rrdtool graph /root/temp/LUDO/IPv4/Datagrams_Received_Delivered_sec.png --start "1366721762" --end "1366724017" --width 453 DEF:ds0=/root/temp/LUDO/IPv4/Datagrams_Received_Delivered_sec.rrd:value:LAST:step=5 LINE1:ds0#0000FF:"IPv4\Datagrams Received Delivered/sec" VDEF:ds0max=ds0,MAXIMUM VDEF:ds0avg=ds0,AVERAGE VDEF:ds0min=ds0,MINIMUM COMMENT:" " COMMENT:" Maximum " GPRINT:ds0max:"%6.2lf" COMMENT:" Average " GPRINT:ds0avg:"%6.2lf" COMMENT:" Minimum " GPRINT:ds0min:"%6.2lf"
534x177
2.Generating Graph: /root/temp/LUDO/IPv4/Datagrams_Received_Unknown_Protocol.png
rrdtool graph /root/temp/LUDO/IPv4/Datagrams_Received_Unknown_Protocol.png --start "1366721762" --end "1366724017" --width 453 DEF:ds0=/root/temp/LUDO/IPv4/Datagrams_Received_Unknown_Protocol.rrd:value:LAST:step=5 LINE1:ds0#0000FF:"IPv4\Datagrams Received Unknown Protocol" VDEF:ds0max=ds0,MAXIMUM VDEF:ds0avg=ds0,AVERAGE VDEF:ds0min=ds0,MINIMUM COMMENT:" " COMMENT:" Maximum " GPRINT:ds0max:"%6.2lf" COMMENT:" Average " GPRINT:ds0avg:"%6.2lf" COMMENT:" Minimum " GPRINT:ds0min:"%6.2lf"
534x177
...

generated_graphs

Now it’s done! 

The script generates a folder named after the server (LUDO in my example) and a subfolder for each class of counters (as you see them in Performance Monitor).

Inside each folder you will have a PNG (and an rrd) for each metric.

 

generated_graph_cpu

 

Important: The RRDs are generated with a single round-robin archive whose size equals the number of samples. If you want the rrd files to store your historical data, you’ll need to modify the script. Also, the width of the graph will match the number of samples (for best readability), but is limited to 1000 to avoid huge images.

Future Improvements

It would be nice to have a prepared set of standard graphs with multiple metrics (e.g. CPU user, system and idle together) and additional lines like regressions…

Download the script: process_l_php.txt and rename it with a .php extension.

Hope you’ll find it useful!

Cheers

Ludo

SQLServer 2008R2 unattended installation and configuration via powershell

My first steps on Powershell

Ok, Ok, as an “Oracle on Linux Certified Expert”, I’ve never been a great fan of SQLServer (I shouldn’t say this, I’ve been working on SQLServer since release 6.5…) and I’ve always hated the DOS command prompt.
However, things have been changing fast since Microsoft released PowerShell some years ago. It’s surprising: Windows powershell now supports best-of-breed features like aliases and pipelines. 😀

Today Microsoft itself recommends Windows Core installations instead of the full ones, and SQLServer 2012 also comes with a lot of new cmdlets to manage your server.

So I’ve decided to take my first steps in the PowerShell world, and I’ve created for a customer a single PowerShell script that installs and configures a SQL2008 instance.

It was quite painful to complete; you can download the complete script HERE.

Taking parameters

The very first line accepts named parameters. I’ve tried to reduce their number, but I’ve preferred to allow, as an example, different disks for different tasks.

param ([string] $instance, [string] $collation, $ServiceAccountPassword, $saPassword, $sourceDir, $servicePackExec, $instDrive, $userDbDrive, $userLogDrive, $tempDbDrive,  $tempLogDrive, $backupDrive , $port )

Then there’s a little interaction if some parameters are missing. In fact, I can launch my script without inline parameters and specify everything when prompted by the script.

######################
# Getting Parameters #
######################
if ( -not $sourceDir) { $sourceDir = Read-Host 'Source Path containing SQL2008R2 installation ? ' }
if ( -not $servicePackExec) { $servicePackExec = Read-Host 'Full Path to service pack executable [Empty for None]? ' }
if ( -not $instance) { $instance = Read-Host 'Instance Name? ' }
if ( -not $collation ) {$collation = Read-Host 'Collation [Latin1_General_CI_AS]? ' }
if ( -not $collation ) { $collation = "Latin1_General_CI_AS" }

if ( -not $port ) { $port = Read-Host 'TCP Port [50000]? ' }
if ( -not $port ) { $port = "50000" }

if ( -not $ServiceAccountPassword ) {
    [System.Security.SecureString]$ServiceAccountPassword = Read-Host "Enter the service account password: " -AsSecureString;
    [String]$syncSvcAccountPassword = [Runtime.InteropServices.Marshal]::PtrToStringAuto([Runtime.InteropServices.Marshal]::SecureStringToBSTR($ServiceAccountPassword));
} else { [String]$syncSvcAccountPassword = $ServiceAccountPassword; }

if ( -not $saPassword ) {
    [System.Security.SecureString]$saPasswordSec = Read-Host "Enter the sa password: " -AsSecureString;
    [String]$saPassword = [Runtime.InteropServices.Marshal]::PtrToStringAuto([Runtime.InteropServices.Marshal]::SecureStringToBSTR($saPasswordSec));
} else { [String]$saPassword = $saPassword; }

$instance = $instance.ToUpper()

Before prompting for the drive letters for the installation paths, I display a little table of the available local disks:

function displayDrives () {
    Get-WmiObject -class "Win32_LogicalDisk" | ?{ @(2, 3) -contains $_.DriveType } | where {$_.Freespace} | select Name, VolumeName, Size, FreeSpace
}

########################
# Getting Disk Letters #
########################

$driveDisplayed=0

if ( -not $instDrive ) {
    if ( $driveDisplayed -eq 0 ) { displayDrives ; $driveDisplayed=1 }
    $instDrive = Read-Host 'Drive letter (without colon) for Instance Installation? '
}

if ( -not $userDbDrive ) {
    if ( $driveDisplayed -eq 0 ) { displayDrives ; $driveDisplayed=1 }
    $userDbDrive = Read-Host 'Drive letter (without colon) for User Databases? '
}

if ( -not $userLogDrive ) {
    if ( $driveDisplayed -eq 0 ) { displayDrives ; $driveDisplayed=1 }
    $userLogDrive = Read-Host 'Drive letter (without colon) for Transaction Logs ? '
}

if ( -not $tempDbDrive ) {
    if ( $driveDisplayed -eq 0 ) { displayDrives ; $driveDisplayed=1 }
    $tempDbDrive = Read-Host 'Drive letter (without colon) for Temp Database ? '
}

if ( -not $tempLogDrive ) {
    if ( $driveDisplayed -eq 0 ) { displayDrives ; $driveDisplayed=1 }
    $tempLogDrive = Read-Host 'Drive letter (without colon) for Temp Logs ? '
}

if ( -not $backupDrive ) {
    if ( $driveDisplayed -eq 0 ) { displayDrives ; $driveDisplayed=1 }
    $backupDrive = Read-Host 'Drive letter (without colon) for Backups ? '
}

Installing prerequisites

###############################
# install prerequisites for SQL2008R2
function installPrereqs () {
    Import-Module ServerManager
    Add-WindowsFeature Application-Server,AS-NET-Framework,NET-Framework,NET-Framework-Core,WAS,WAS-Process-Model,WAS-NET-Environment,WAS-Config-APIs
    # get-windowsfeature | Where {$_.Installed} | Sort FeatureType,Parent,Name | Select Name,Displayname,FeatureType,Parent
}

######################
# Installing Prereqs #
######################
"Installing Prerequisites (.Net, etc) ..."
installPrereqs

The commented-out command gets the installed features after the installation. There’s no real need to display it; it works really well.

Dynamically prepare a configuration file

The unattended installation needs some parameters prepared in a configuration file.
This is likely where you will change the most to match your standards:
components, paths, service accounts; you can change everything, or modify the script to also accept these variables as parameters.

The full documentation about filling the configuration file is on MSDN:

http://msdn.microsoft.com/en-us/library/ms144259%28v=sql.105%29.aspx

#########################################
# prepare the standard configuration file
function prepareConfigFile ([String]$instance, [String]$collation, $instDrive, $userDbDrive, $userLogDrive, $tempDbDrive, $tempLogDrive, $backupDrive ) {
$config = "[SQLSERVER2008]
ACTION=""Install""
FEATURES=SQLENGINE,REPLICATION,FULLTEXT,BIDS,CONN,IS,BC,BOL,SSMS,ADV_SSMS,SNAC_SDK
INSTANCENAME=""$instance""
INSTANCEID=""$instance""
INSTALLSHAREDDIR=""C:\Program Files\Microsoft SQL Server""
INSTALLSHAREDWOWDIR=""C:\Program Files (x86)\Microsoft SQL Server""
INSTANCEDIR=""C:\Program Files\Microsoft SQL Server""
INSTALLSQLDATADIR="""+$instDrive+":\MSSQL\$instance""
SQLUSERDBDIR="""+$userDbDrive+":\MSSQL\$instance\MSSQL10_50."+$instance+"\MSSQL\Data""
SQLUSERDBLOGDIR="""+$userLogDrive+":\MSSQL\$instance\MSSQL10_50."+$instance+"\MSSQL\Tlog""
SQLTEMPDBDIR="""+$tempDbDrive+":\MSSQL\$instance\MSSQL10_50."+$instance+"\MSSQL\Data""
SQLTEMPDBLOGDIR="""+$tempLogDrive+":\MSSQL\$instance\MSSQL10_50."+$instance+"\MSSQL\Tlog""
SQLBACKUPDIR="""+$backupDrive+":\MSSQL\$instance\MSSQL10_50."+$instance+"\MSSQL\Backup""
FILESTREAMLEVEL=""0""
TCPENABLED=""1""
NPENABLED=""1""
SQLCOLLATION=""$collation""
SQLSVCACCOUNT=""MYDOM\sqlsrvc""
SQLSVCSTARTUPTYPE=""Automatic""
AGTSVCACCOUNT=""MYDOM\sqlsrvc""
AGTSVCSTARTUPTYPE=""Automatic""
ISSVCACCOUNT=""NT AUTHORITY\NetworkService""
ISSVCSTARTUPTYPE=""Automatic""
BROWSERSVCSTARTUPTYPE=""Automatic""
SQLSYSADMINACCOUNTS=""MYDOM\sqlsrvc""
SECURITYMODE=""SQL""
SQMREPORTING=""False""
IACCEPTSQLSERVERLICENSETERMS=""True"""

$config
}

#####################
# Creating Ini File #
#####################
$workDir = pwd

"Creating Ini File for Installation..."
$configFile = "$workDir\sql2008_"+$instance+"_install.ini"

prepareConfigFile $instance $collation $instDrive $userDbDrive $userLogDrive $tempDbDrive $tempLogDrive $backupDrive | Out-File $configFile

"Configuration File written to: "+$configFile

Starting the SQL Server 2008 R2 installation

Of course you'll need the installation media, downloaded from the Microsoft customer site with the correct license keys, mounted somewhere (remember the $sourceDir parameter?). I've decided to change the working directory to the one containing the media and then change it back.

#######################################
# Starting SQL 2008 Base Installation #
#######################################

set-location $sourceDir

"Starting SQL 2008 Base Installation..."
$installCmd = ".\setup.exe /qs /SQLSVCPASSWORD=""$syncSvcAccountPassword"" /AGTSVCPASSWORD=""$syncSvcAccountPassword"" /SAPWD=""$saPassword"" /ConfigurationFile=""$configFile"""
$installCmd

Invoke-Expression $installCmd

set-location $workDir
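Invoke-Expression works here because setup.exe with /qs blocks until the installation ends. If you prefer not to build the command as a string, a sketch using Start-Process (same parameters and variables as above) also lets you check the setup exit code:

```powershell
# Sketch: run setup.exe via Start-Process instead of Invoke-Expression.
# $sourceDir, $syncSvcAccountPassword, $saPassword and $configFile are the
# variables already used above.
$setupArgs = "/qs /SQLSVCPASSWORD=""$syncSvcAccountPassword"" " +
             "/AGTSVCPASSWORD=""$syncSvcAccountPassword"" " +
             "/SAPWD=""$saPassword"" /ConfigurationFile=""$configFile"""
$proc = Start-Process -FilePath "$sourceDir\setup.exe" -ArgumentList $setupArgs -Wait -PassThru
if ($proc.ExitCode -ne 0) {
    Write-Warning "Setup finished with exit code $($proc.ExitCode), check the setup logs."
}
```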

 

Launching the Service Pack installation

The Service Pack installation has been a little more painful: normally it would be simple, but the PowerShell prompt is returned immediately after firing the command. So, to wait for it, I had to figure out the name of the process (it's the executable name without the .exe extension), get its process id and wait for that process:

#######################################
# Starting SQL 2008 SP Installation #
#######################################
if ( $servicePackExec) {
    "Starting Service Pack Installation..."
    $patchCmd = "$servicePackExec /Action=Patch /Quiet /IAcceptSQLServerLicenseTerms /Instancename=""$Instance"""
    $patchCmd
    Invoke-Expression $patchCmd

    ## have to take the name of the process and wait for the completion of the pid because service packs
    ## return prompt immediately and then run in background
    $process=[System.IO.Path]::GetFileNameWithoutExtension($servicePackExec)
    $nid = (Get-Process $process).id
    Wait-Process -id $nid
}
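An equivalent way to get the same wait, assuming the same variables, is to let Start-Process do it. Note that -Wait only waits for the process you launched, so if the service pack extractor re-spawns itself the process-name approach above remains the safer one:

```powershell
# Sketch: run the service pack synchronously with Start-Process.
# $servicePackExec and $Instance are the variables already used above.
$spArgs = "/Action=Patch /Quiet /IAcceptSQLServerLicenseTerms /Instancename=""$Instance"""
Start-Process -FilePath $servicePackExec -ArgumentList $spArgs -Wait
```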

Changing the TCP port

By default SQL Server starts listening on a dynamic port. If you have a standard port to assign and you want to configure it without opening the Configuration Manager, you can do it with this snippet that I've copied from sirSql (thank you for sharing it).

###############################
# change the TCP port at the end of the installation
function changePort ($SQLName , $Instance, $port) {
    Try
        {

    $SQLName
    $Instance

    # Load SMO Wmi.ManagedComputer assembly
    [System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SqlWmiManagement") | out-null

    Trap {
        $err = $_.Exception
        while ( $err.InnerException ) {
            $err = $err.InnerException
            write-output $err.Message
        }
        continue
    }

    # Connect to the instance using SMO
    $m = New-Object ('Microsoft.SqlServer.Management.Smo.Wmi.ManagedComputer') $SQLName
    $urn = "ManagedComputer[@Name='$SQLName']/ServerInstance[@Name='$Instance']/ServerProtocol[@Name='Tcp']"
    $Tcp = $m.GetSmoObject($urn)
    $Enabled = $Tcp.IsEnabled
    #Enable TCP/IP if not enabled
    IF (!$Enabled)
        {$Tcp.IsEnabled = $true }

    #Set to listen on $port and disable dynamic ports
    $m.GetSmoObject($urn + "/IPAddress[@Name='IPAll']").IPAddressProperties['TcpPort'].Value = $port
    $m.GetSmoObject($urn + "/IPAddress[@Name='IPAll']").IPAddressProperties['TcpDynamicPorts'].Value = ''
    $TCP.alter()

        "Success: SQL set to listen on TCP/IP port $port. Please restart the SQL service for changes to take effect."
    }
    Catch { Write-Warning "Unable to enable TCP/IP & set SQL to listen on port $port" }
}

####################
# Changing TCPport #
####################

"Changing TCP port to $port..."
changePort $hostName $instance $port

Adding your stuff at the end

Having the installation completed is only halfway there. After the installation you may want to add tempdb files, modify your model database, or add default accounts.

That's up to you. If your scripts are identical across servers you can execute them with sqlcmd. If you want to take advantage of the variables already set in the script, you can execute them directly:

###############################
# Resizing / Adding Tempfiles #
###############################

$Connection = New-Object System.Data.SQLClient.SQLConnection
$hostName = get-content env:computername

$Connection.ConnectionString ="Server=$hostName\$instance;Database=master;uid=sa;Pwd=$saPassword;"
$Connection.Open()

$Command = New-Object System.Data.SQLClient.SQLCommand
$Command.Connection = $Connection

$Command.CommandText = "ALTER DATABASE tempdb MODIFY FILE (NAME = tempdev, SIZE = 1024MB, filegrowth = 64MB, maxsize=unlimited);"
$Command.ExecuteNonQuery() | out-null
$Command.CommandText = "ALTER DATABASE tempdb MODIFY FILE (NAME = templog, SIZE = 512MB, filegrowth = 64MB, maxsize=unlimited);"
$Command.ExecuteNonQuery() | out-null

$Command.CommandText = "ALTER DATABASE tempdb ADD FILE (NAME = tempdev2, FILENAME = '"+$tempDbDrive+":\MSSQL\$instance\MSSQL10_50.$instance\MSSQL\Data\tempdb2.ndf', SIZE = 1024MB, filegrowth = 64MB, maxsize=unlimited);"
$Command.ExecuteNonQuery() | out-null
$Command.CommandText = "ALTER DATABASE tempdb ADD FILE (NAME = tempdev3, FILENAME = '"+$tempDbDrive+":\MSSQL\$instance\MSSQL10_50.$instance\MSSQL\Data\tempdb3.ndf', SIZE = 1024MB, filegrowth = 64MB, maxsize=unlimited);"
$Command.ExecuteNonQuery() | out-null
$Command.CommandText = "ALTER DATABASE tempdb ADD FILE (NAME = tempdev4, FILENAME = '"+$tempDbDrive+":\MSSQL\$instance\MSSQL10_50.$instance\MSSQL\Data\tempdb4.ndf', SIZE = 1024MB, filegrowth = 64MB, maxsize=unlimited);"
$Command.ExecuteNonQuery() | out-null
$Command.ExecuteNonQuery() | out-null

$Connection.Close()
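If you keep the post-install statements in a separate .sql file instead, the same step can be done with sqlcmd. A sketch, assuming a hypothetical postinstall.sql sitting in the working directory:

```powershell
# Sketch: run a post-install script with sqlcmd (the file name is hypothetical).
# -b makes sqlcmd exit with an error code when a batch fails.
$postScript = "$workDir\postinstall.sql"
sqlcmd -S "$hostName\$instance" -U sa -P $saPassword -i $postScript -b
if ($LASTEXITCODE -ne 0) { Write-Warning "sqlcmd reported an error running $postScript" }
```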

Putting it all together…

Well, I won't paste all of the content here again; you can download the script HERE. Just change the file extension from .txt to .ps1.

I know it's not a rock-solid procedure, but it works well for my purposes. Feel free to comment on or review my script. Just, if you make some improvement to it, please share it and link back to this post!

Cheers