Wednesday, 3 February 2010

SQL Server Stats and Query Optimiser tips

Great article here - 13 Things You Should Know About Statistics and the Query Optimiser. It covers a lot of ground and is well worth a read.

Tuesday, 19 January 2010

SQL Plan Parallelism issue

Had a strange scenario whereby a stored procedure would randomly execute very slowly: it normally took half a second, but occasionally took 30 seconds or more. As it ran against busy tables (the results varied considerably even though the SQL was static with no parameters), I initially thought the cached plan was badly optimised. However, running the statement as straight SQL showed the same issue. WITH RECOMPILE seemed to work around the problem, but added a couple of seconds to the execution time.

So I decided to dig a little deeper into the plan, and noticed that there was a Parallelism operator for just about every statement. Even stranger, I found that if I removed a particular column from the select list (which happened to be a DECIMAL(20,9)), the parallelism went away and the SQL flew.

Now I'm not sure whether there is any extra work required when selecting a DECIMAL column, but it was definitely related: removing any other column made no difference to the parallelism (or the performance).

Finally settled on stopping SQL Server from pursuing parallelism by adding the following at the end of the SELECT statement:

OPTION ( MAXDOP 1 )
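
As an illustration, a minimal sketch with hypothetical table and column names (not the actual query from this issue) - the hint simply goes right at the end of the statement:

SELECT st.SomeKey,
       st.SomeDecimalValue      -- hypothetical DECIMAL(20,9) column
FROM dbo.SomeBusyTable AS st
WHERE st.SomeStatus = 1
OPTION ( MAXDOP 1 )             -- restrict this statement to a serial plan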


This forces the SQL to be run on a single processor, and lo, the performance is now back to less than half a second.

Maybe not an ideal solution, as I'm telling SQL Server that it is wrong. However, looking at the plan, all the estimated costs are well off, so I can only assume this is some weird set of circumstances.

Monday, 18 January 2010

Detecting blocking SQL 2005

I've got a very occasional problem whereby a process blocks the incremental rebuild of some fulltext indexes, which in turn blocks loads of other statements. Trapping this quickly enough that it doesn't affect users too much has been a problem. I then found this Tony Rogerson article which went a long way towards implementing a solution.

Repeating Tony's article (look there for the full details, this is just the code):

First set the blocked process report threshold; 'show advanced options' needs to be enabled:

sp_configure 'show advanced options', 1
go
reconfigure
go
sp_configure
go

-- Set up report capture threshold. Set to 10 seconds here - change figure
-- to suit needs.
sp_configure 'blocked process threshold', 10
go
reconfigure
go


Next create a database to receive the messages:

--CREATE DATABASE [RBT.DBAEventManagement]
--go

USE [RBT.DBAEventManagement]
go

ALTER DATABASE [RBT.DBAEventManagement] SET ENABLE_BROKER
go

CREATE QUEUE syseventqueue
go

CREATE SERVICE syseventservice
ON QUEUE syseventqueue ( [http://schemas.microsoft.com/SQL/Notifications/PostEventNotification] )
go

CREATE EVENT NOTIFICATION notify_locks
ON SERVER
WITH fan_in
FOR blocked_process_report
TO SERVICE 'syseventservice', 'current database';

GO


This is now enough to capture the messages and queue them up. The next part is to receive the messages and to email out/store the history. Only one email is sent per block - we don't want an email every 10 seconds! The history table is as follows:

USE [RBT.DBAEventManagement]
GO

IF OBJECT_ID('dbo.tbl_messages', 'U') IS NOT NULL
DROP TABLE dbo.tbl_messages
GO

CREATE TABLE dbo.tbl_messages
(
Message_Body XML NOT NULL,
Message_Sequence_Number INT NOT NULL,
xactid VARCHAR(20) NOT NULL,
Emailed VARCHAR(1) NOT NULL CONSTRAINT [DF_tbl_messages_Emailed] DEFAULT 'N',
TMStamp SMALLDATETIME NOT NULL CONSTRAINT [DF_tbl_messages_TMStamp] DEFAULT (GETDATE()),
CONSTRAINT PK_messages PRIMARY KEY NONCLUSTERED(TMStamp ASC, Message_Sequence_Number ASC)
)

GO


The stored procedure to receive the messages is as follows:

CREATE PROCEDURE [dbo].[proc_capture_queue]
AS


/* ========================================================================
proc_capture_queue

Capture service broker queue and insert into history table.
Email new message

===========================================================================
Version Date Author Comment
------- ---- ------ -------
v0.00 14/01/2010 Brian Jones Initial Version
=========================================================================== */
SET NOCOUNT ON

DECLARE @msg_body XML,
@DatabaseID INT,
@Process XML,
@xactid VARCHAR(20),
@emailBody NVARCHAR(MAX),
@13LineFeed10 NVARCHAR(2)

DECLARE @msgs TABLE ( message_body XML NOT NULL,
message_sequence_number INT NOT NULL,
xactid VARCHAR(20))

SET @13LineFeed10 = CHAR(13) + CHAR(10)

IF NOT EXISTS (SELECT * FROM sys.server_event_notifications)
BEGIN
EXEC msdb.dbo.sp_send_dbmail @profile_name = 'MailProfile',
@recipients = N'myemail@myaddress',
@subject = 'WARNING: SQL Blocking Service Queue',
@body = 'The SQL message queue for tracking blocking sessions is missing, please investigate.'
RETURN
END;

RECEIVE message_body, message_sequence_number,
CAST( message_body AS XML ).value( '/EVENT_INSTANCE[1]/TextData[1]/blocked-process-report[1]/blocked-process[1]/process[1]/@xactid', 'varchar(20)')
FROM syseventqueue
INTO @msgs;

INSERT INTO dbo.tbl_messages(Message_Body, Message_Sequence_Number, xactid)
SELECT Message_Body, Message_Sequence_Number, xactid
FROM @msgs
ORDER BY message_sequence_number

DECLARE email_cur CURSOR FOR
SELECT Message_Body, xactid
FROM @msgs
ORDER BY message_sequence_number

OPEN email_cur
FETCH NEXT FROM email_cur INTO @msg_body, @xactid

WHILE @@FETCH_STATUS = 0
BEGIN
IF NOT EXISTS(SELECT * FROM tbl_messages WHERE xactid = @xactid AND Emailed = 'Y')
BEGIN
SELECT @DatabaseId = CAST( @msg_body AS XML ).value( '(/EVENT_INSTANCE/DatabaseID)[1]', 'int' )

SELECT @emailBody = N'The following process is being blocked.' + @13LineFeed10 + @13LineFeed10 +
'Blocked Process ' + @13LineFeed10 +
'=============== ' + @13LineFeed10 +
'Spid : ' + CAST( @msg_body AS XML ).value( '/EVENT_INSTANCE[1]/TextData[1]/blocked-process-report[1]/blocked-process[1]/process[1]/@spid', 'varchar(10)')
+ @13LineFeed10 +
'HostPid : ' + CAST( @msg_body AS XML ).value( '/EVENT_INSTANCE[1]/TextData[1]/blocked-process-report[1]/blocked-process[1]/process[1]/@hostpid', 'varchar(10)')
+ @13LineFeed10 +
'HostName : ' + CAST( @msg_body AS XML ).value( '/EVENT_INSTANCE[1]/TextData[1]/blocked-process-report[1]/blocked-process[1]/process[1]/@hostname', 'varchar(100)')
+ @13LineFeed10 +
'LoginName : ' + CAST( @msg_body AS XML ).value( '/EVENT_INSTANCE[1]/TextData[1]/blocked-process-report[1]/blocked-process[1]/process[1]/@loginname', 'varchar(100)')
+ @13LineFeed10 +
'DatabaseName : ' + DB_Name(@DatabaseID)
+ @13LineFeed10 +
'Batch Started : ' + CAST( @msg_body AS XML ).value( '/EVENT_INSTANCE[1]/TextData[1]/blocked-process-report[1]/blocked-process[1]/process[1]/@lastbatchstarted', 'varchar(25)')
+ @13LineFeed10 +
@13LineFeed10 +
'Blocking Process ' + @13LineFeed10 +
'================ ' + @13LineFeed10 +
'Spid : ' + CAST( @msg_body AS XML ).value( '/EVENT_INSTANCE[1]/TextData[1]/blocked-process-report[1]/blocking-process[1]/process[1]/@spid', 'varchar(10)')
+ @13LineFeed10 +
'HostPid : ' + CAST( @msg_body AS XML ).value( '/EVENT_INSTANCE[1]/TextData[1]/blocked-process-report[1]/blocking-process[1]/process[1]/@hostpid', 'varchar(10)')
+ @13LineFeed10 +
'HostName : ' + CAST( @msg_body AS XML ).value( '/EVENT_INSTANCE[1]/TextData[1]/blocked-process-report[1]/blocking-process[1]/process[1]/@hostname', 'varchar(100)')
+ @13LineFeed10 +
'LoginName : ' + CAST( @msg_body AS XML ).value( '/EVENT_INSTANCE[1]/TextData[1]/blocked-process-report[1]/blocking-process[1]/process[1]/@loginname', 'varchar(100)')
+ @13LineFeed10 +
'DatabaseName : ' + DB_Name(@DatabaseID)
+ @13LineFeed10 +
'Batch Started : ' + CAST( @msg_body AS XML ).value( '/EVENT_INSTANCE[1]/TextData[1]/blocked-process-report[1]/blocking-process[1]/process[1]/@lastbatchstarted', 'varchar(25)')
+ @13LineFeed10 +
@13LineFeed10 +
'Message in full : ' + @13LineFeed10 +
CAST(@msg_body AS VARCHAR(MAX))

EXEC msdb.dbo.sp_send_dbmail @profile_name = 'MailProfile',
@recipients = N'myemail@myaddress',
@subject = 'SQL Blocking Report',
@body = @emailBody

UPDATE tbl_messages
SET Emailed = 'Y'
WHERE xactid = @xactid
END

FETCH NEXT FROM email_cur INTO @msg_body, @xactid
END

CLOSE email_cur
DEALLOCATE email_cur


------------------------ END OF PROCEDURE --------------------------------


OK, I've used a SQL cursor, but it's over a small table variable so it isn't an issue.

This stored procedure is called from a job which loops continuously, using WAITFOR DELAY '00:01' to pause for a minute between executions - something along the lines of the sketch below. There is another job which clears the history table down every week.
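
A minimal sketch of the job step (assuming a simple loop - adjust the delay to suit):

WHILE 1 = 1
BEGIN
    -- drain the event queue and email any newly blocked processes
    EXEC [RBT.DBAEventManagement].dbo.proc_capture_queue

    -- wait one minute before checking the queue again
    WAITFOR DELAY '00:01'
END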

This is now happily whirring away waiting for a blocking process to appear. It may (unfortunately) need expanding to run a SQL Trace as well, so that the blocking process can be investigated further. But that's another blog post!

Thursday, 17 December 2009

TFS merge filename collision error

The scenario is that I have a "Main" repository which is branched to a "development" tree. Amendments are applied to the "development" tree and, when released, are merged back to the "main" repository. It was this merge back/check-in that was giving the error. Seeing as there had been no changes to filenames or locations (there were some new files, though), the collision error is a little strange.

Anyhow, to sort this out I did a forced Get Latest on the "main" repository, overwriting all the files. Then I merged the "development" tree back, and checking in the pending changes then worked with no error. All via the GUI.

There are also a number of non-GUI solutions that perform a baseless merge. I'm not sure of the side effects of doing this; the solution described above seems to retain all history as expected.

Look here for details on a baseless merge.

Tuesday, 24 November 2009

SSMS Tool Pack

A brilliant add-on for SSMS 2005/2008, and it's free! Features include:

•SQL Snippets
•Window Connection Coloring
•Query Execution History and Current Window History
•Format SQL
•Search Table, View or Database Data
•Run one script on multiple databases
•Copy execution plan bitmaps to clipboard or file
•Search Results in Grid Mode or Execution Plans
•Generate Insert statements from resultsets, tables or database
•Regions and Debug sections
•Running custom scripts from Object Explorer
•CRUD stored procedure generation
•New query template
•General options

Install now, if only for the colour coded connection windows !!

Tuesday, 3 November 2009

SQL Server Memory Settings 64bit

Rather interesting blog posting here talking about SQL Server 2005/2008 64-bit and memory settings. The recommendation is to always set max server memory, as the following table outlines:

Physical RAM    MaxServerMem Setting (MB)
2GB             1500
4GB             3200
6GB             4800
8GB             6400
12GB            10000
16GB            13500
24GB            21500
32GB            29000
48GB            44000
64GB            60000
72GB            68000
96GB            92000
128GB           124000
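
For reference, max server memory is set in MB via sp_configure - a quick sketch for the 8GB row above (it is an advanced option, so 'show advanced options' must be on):

sp_configure 'show advanced options', 1
reconfigure
go
sp_configure 'max server memory (MB)', 6400
reconfigure
go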


What makes this interesting is the comments posted, where one poster mentions having seen instability when max server memory is left unset.

Tuesday, 27 October 2009

Run SQL Trace in background

Rather useful, this, if you need to run a trace for a period of time and cannot leave a Profiler session logged in. Found how to do it here.

First use Profiler to define the events, columns and filters needed. Useful events are SQL:BatchCompleted, RPC:Completed and SP:StmtCompleted. Important columns are Duration, CPU, Reads and Writes. More advanced events are SP:Recompile (to check for recompilations) and Scan:Started (to check for table and index scans).

Click the Run button, then immediately stop the trace.

Click the File menu, expand the Export option, and then expand the Script Trace Definition option. Choose For SQL Server 2005 (or For SQL Server 2000 if creating a script for an older server) and select a filename to save the script.

Once the script has been saved, open it for editing in SQL Server Management Studio.

The following line of the script must be edited so that a valid path and filename are specified (sp_trace_create appends the .trc extension itself):

exec @rc = sp_trace_create @TraceID output, 0, N'InsertFileNameHere',
@maxfilesize, NULL

The @maxfilesize is 5MB by default.
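
For example, with a hypothetical path:

exec @rc = sp_trace_create @TraceID output, 0, N'C:\Traces\myTrace',
@maxfilesize, NULL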

Run the script. It will also select back the TraceID of the newly created trace.

Once you are done, use that TraceID (99 here is just an example) to stop and then close the trace:

EXEC sp_trace_setstatus @traceid=99, @status=0   -- stop the trace
EXEC sp_trace_setstatus @traceid=99, @status=2   -- close the trace and delete its definition
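
If the TraceID has gone astray, it can be looked up (SQL Server 2005 onwards) in sys.traces:

SELECT id, path, status
FROM sys.traces
WHERE is_default = 0   -- ignore the default trace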

The fn_trace_gettable function can be used to read the data back from the trace file:

SELECT * FROM ::fn_trace_gettable('C:\Traces\myTrace.trc', 999)

where 999 is the number of rollover trace files to read.
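
A slightly fuller sketch (same hypothetical path) that resolves the EventClass numbers to readable event names via sys.trace_events:

SELECT te.name AS EventName,
       t.TextData,
       t.Duration,
       t.CPU,
       t.Reads,
       t.Writes
FROM sys.fn_trace_gettable('C:\Traces\myTrace.trc', default) AS t
JOIN sys.trace_events AS te ON t.EventClass = te.trace_event_id
ORDER BY t.StartTime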


Easy !