I took my MySQL 5.0 OCP certification in 2010. A little late, since this certification has been around for a long time, but surprisingly it’s still the only version of the MySQL OCP certification.
The 5.0 branch started in 2003 and went GA in 2005:
“MySQL 5.0.0, a new version of the popular Open Source/Free Software Database Management System, has been released.”
Michael Widenius, December 24 2003
“It is my pleasure to announce the production release of MySQL 5.0, which is hereby GA (Generally Available).”
Kaj Arnö, October 24th 2005
To make a comparison with Oracle, it’s as if the current certification were still based on 10gR1.
Yesterday I received an email from Oracle University announcing new courses on MySQL 5.6 to add certifications to the resume:
“The MySQL for Database Administrators course is for DBAs and other database professionals who want to install and configure MySQL Server, set up replication and security, perform database backups and performance tuning, and protect MySQL databases. This course has been updated to cover commercial features of MySQL as well as enhanced replication capabilities in MySQL 5.6.”
Oracle University, May 20th 2013
I started thinking that a new certification path for 5.6 was available, but on the Oracle site I see that the certification is still the same. They have just added a sentence to the preparation notes:
“This exam has been validated through MySQL 5.5. The recommended course is based on MySQL 5.6. Candidates wishing to use this course to prepare may wish to consult product documentation to ensure they understand the differences between MySQL 5.5 and MySQL 5.6”
In my previous post I showed how to collect data and insert it into a database table using PowerShell. Now it’s time to get some information out of that data, and I’ve used T-SQL for this purpose.
The backup exceptions
Every environment has some backup rules and backup exceptions. For example, you don’t want to check for failures on the model, northwind, adventureworks or distribution databases.
I got rid of this problem by using the “exception table” created in the previous post. The rules are defined by pattern matching. First, we need to define a generic rule for our default backup schedules:
T-SQL

INSERT INTO [Tools].[dbo].[DB_Backup_Exceptions]
    ([InstanceName],[DatabaseName],[LastFullHours]
    ,[LastLogHours],[Description],[BestBefore])
VALUES
    ('%','%',36,12,'Default delays for all databases',NULL)
GO
In the previous example, we’ll check that all databases (‘%’) on all instances (again ‘%’) have been backed up at least every 36 hours, and that a log backup has occurred in the last 12 hours. The description is useful to remember why such a rule exists.
The “BestBefore” column allows you to define a time limit for the rule. For example, if you do some maintenance and are skipping some schedules, you can safely insert a rule that expires after X days, so you avoid alerts during the maintenance without forgetting to delete the rule afterwards.
T-SQL

INSERT INTO [Tools].[dbo].[DB_Backup_Exceptions]
    ([InstanceName],[DatabaseName],[LastFullHours]
    ,[LastLogHours],[Description],[BestBefore])
VALUES
    ('SERVER1','%',1000000,1000000,'Maintenance until 12.05.2013','2013-05-12 00:00:00')
GO
The previous rule will skip backup reports on SERVER1 until May 12th.
T-SQL

INSERT INTO [Tools].[dbo].[DB_Backup_Exceptions]
    ([InstanceName],[DatabaseName],[LastFullHours]
    ,[LastLogHours],[Description],[BestBefore])
VALUES
    ('%','Northwind',1000000,1000000,'Don''t care about northwind',NULL)
GO
The previous rule will skip all reports for any database named Northwind, on any instance.
Important: If multiple rules apply to the same database, the rule with a higher time threshold wins.
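This precedence can be resolved directly in T-SQL with LIKE pattern matching. The following is only a sketch of the idea (not the actual query used by the framework), checking which thresholds apply to a given instance/database pair:

```sql
-- Sketch: pick the highest thresholds among all non-expired rules
-- matching a given instance/database pair
SELECT MAX(e.LastFullHours) AS LastFullHours,
       MAX(e.LastLogHours)  AS LastLogHours
FROM [Tools].[dbo].[DB_Backup_Exceptions] e
WHERE 'SERVER1'   LIKE e.InstanceName       -- '%' rules match everything
  AND 'Northwind' LIKE e.DatabaseName
  AND (e.BestBefore IS NULL OR e.BestBefore > GETDATE())  -- skip expired rules
```

Because the rule columns are used as LIKE patterns, the generic ‘%’ rule always matches, and any more aggressive exception simply raises the winning threshold.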
The queries
The following query lists the databases whose last full backup is older than the defined threshold:
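The actual query is not reproduced in this excerpt; a minimal sketch of what it might look like, assuming the db_status table from the previous post and made-up column names, follows:

```sql
-- Sketch only: table and column names are assumptions
SELECT s.InstanceName, s.DatabaseName, s.LastFull
FROM dbo.db_status s
CROSS APPLY (
    -- highest threshold among the matching, non-expired exception rules
    SELECT MAX(e.LastFullHours) AS LastFullHours
    FROM [Tools].[dbo].[DB_Backup_Exceptions] e
    WHERE s.InstanceName LIKE e.InstanceName
      AND s.DatabaseName LIKE e.DatabaseName
      AND (e.BestBefore IS NULL OR e.BestBefore > GETDATE())
) x
WHERE DATEDIFF(hour, s.LastFull, GETDATE()) > x.LastFullHours
ORDER BY s.LastFull;
```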
Checking database backups has always been one of the main concerns of DBAs. With Oracle it’s quite easy thanks to a central RMAN catalog, but with other databases doing it with little effort can be a great challenge.
Some years ago I developed a little framework to control all SQL Server databases. This framework was based on Linux (strange but true!), bash, FreeTDS, sqsh and flat configuration files. It’s still doing its job well, but not all SQL Server DBAs can deal with complex bash scripting, so a customer of mine asked me if I could rewrite it in a Microsoft-friendly language.
So I decided to go for a PowerShell script, in conjunction with a couple of tables for the configuration and the data, and a simple T-SQL script to provide HTML reporting. I have to say that I’m not an expert in PowerShell, and it’s far from being as flexible as other scripting languages (compared to Perl, Python or PHP, they only have the initial ‘P’ in common). However, I managed to put together something usable.
The principle
This is quite simple: the PowerShell script looks up the list of instances in a reference table, then sequentially connects to each one and retrieves the data:
recovery mode
status
creation time
last full backup
last log backup
This data is merged into a table on the central repository. Finally, a T-SQL script does some reporting.
Custom classes in PowerShell
One of the big annoyances with PowerShell 2.0 is the lack of custom class definitions, which is especially odd considering that PowerShell is highly object-oriented. To define your own classes to work with, you have to define them in another language (C# in this example):
PowerShell

Add-Type @'
using System;
public class DatabaseBackup
{
    public string instanceName;
    public string databaseName;
    public string recoveryMode;
    public string status;
    public DateTime creationTime;
    public DateTime lastFull;
    public DateTime lastLog;
    private TimeSpan diff;
    public double lastFullTotalHours () {
        diff = DateTime.Now - lastFull;
        return Math.Round(diff.TotalHours,2);
    }
    public double lastLogTotalHours () {
        diff = DateTime.Now - lastLog;
        return Math.Round(diff.TotalHours,2);
    }
}
'@
For better readability, I’ve put this definition in a separate file (DatabaseBackup.ps1).
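Once the file is dot-sourced, the class behaves like any other .NET type. A quick usage sketch (the values are made up):

```powershell
# Load the class definition, then create and populate an object
. .\DatabaseBackup.ps1
$b = New-Object DatabaseBackup
$b.instanceName = "SERVER1"
$b.databaseName = "mydb"
$b.lastFull = (Get-Date).AddHours(-40)
# Hours elapsed since the last full backup (about 40 in this example)
$b.lastFullTotalHours()
```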
where type ='L' and backup_finish_date <= getdate()
group by database_name
) c
on d.name = c.database_name
where d.name <> 'Tempdb'
order by [LastFull]";
I’ve also put this snippet in a separate file, queries.ps1, to improve readability.
The tables
The first table (DB_Servers) can be as simple as a single column containing the instances to check. This can be any other kind of source like a corporate CMDB or similar.
The second table will contain the collected data. Of course it can be expanded!
The third table will contain some rules for managing exceptions. Such exceptions can be useful in situations like “all databases named northwind should not be checked”. I’ll show some examples in the next post.
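The original post includes the full definitions; a rough sketch of the first two tables, with assumed column names, could look like this:

```sql
-- Assumed DDL sketch, not the original definitions
CREATE TABLE dbo.DB_Servers (
    db_instance sysname PRIMARY KEY     -- e.g. 'SERVER1\INST1'
);

CREATE TABLE dbo.db_status (
    InstanceName sysname NOT NULL,
    DatabaseName sysname NOT NULL,
    RecoveryMode nvarchar(60),
    Status       nvarchar(60),
    CreationTime datetime,
    LastFull     datetime,
    LastLog      datetime,
    PRIMARY KEY (InstanceName, DatabaseName)
);
```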
$instances = Invoke-Sqlcmd -Query "select [name]=db_instance from db_servers" -ServerInstance $serverName -Database $databasename
Finally, for each instance we have to check, we trigger the function that collects the data and insert the results in the central repository (I’m using a MERGE to update the existing records).
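The loop can be sketched as follows; Get-DatabaseBackup is a placeholder name for the real collection function, and the column names are assumptions:

```powershell
foreach ($instance in $instances) {
    # Get-DatabaseBackup stands in for the script's collection function
    $databases = Get-DatabaseBackup -InstanceName $instance.name
    foreach ($db in $databases) {
        # Upsert each row into the central repository via MERGE
        $merge = "MERGE dbo.db_status AS t
                  USING (SELECT '$($db.instanceName)', '$($db.databaseName)') AS s (i, d)
                  ON (t.InstanceName = s.i AND t.DatabaseName = s.d)
                  WHEN MATCHED THEN
                      UPDATE SET LastFull = '$($db.lastFull)', LastLog = '$($db.lastLog)'
                  WHEN NOT MATCHED THEN
                      INSERT (InstanceName, DatabaseName, LastFull, LastLog)
                      VALUES (s.i, s.d, '$($db.lastFull)', '$($db.lastLog)');"
        Invoke-Sqlcmd -Query $merge -ServerInstance $serverName -Database $databasename
    }
}
```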
You can’t use the PowerShell host embedded in SQL Server because it’s not fully compatible with PowerShell 2.0.
Check that the table db_status is getting populated
Limitations
The script uses Windows authentication, assuming you are working with a centralized domain user. If you want to use SQL authentication (for example, if you are a multi-tenant managed services provider), you need to store your passwords somewhere…
This script is intended to be used with single instances. It should work on clusters, but I haven’t tested it.
Check the backup chain all the way to the tape library: relying only on the information contained in the msdb is not a reliable monitoring solution!
In my next post we’ll see how to generate HTML reports via email and manage exceptions.
Hope you’ll find it useful.
Again, PLEASE: if you improve it, kindly send me back a copy, or blog about it and post the link in the comments!