Thursday 30 July 2009

Bootstrapping .NET 3.5 SP1 with Visual Studio 2008

This is actually fairly easy, EXCEPT for a nasty gotcha in the MS help documentation.

The instructions are here. In summary:

1. Download dotnetfx35 from here

2. Run dotnetfx35.exe /x

3. Copy files to C:\Program Files\Microsoft SDKs\Windows\v6.0A\Bootstrapper\Packages\DotNetFX35SP1

4. Edit the product.xml as per the instructions:

1. Open the [Program Files]\Microsoft SDKs\Windows\v6.0A\Bootstrapper\Packages\DotNetFx35SP1 folder or %ProgramFiles(x86)%\Microsoft SDKs\Windows\v6.0A\Bootstrapper\Packages\DotNetFx35SP1 on x64 operating systems
2. Edit the Product.xml file in Notepad.
3. Paste the following into the <PackageFiles> element:



4. Find the element for <PackageFile Name="dotNetFX30\XPSEPSC-x86-en-US.exe" and change the PublicKey value to: 3082010A0282010100A2DB0A8DCFC2C1499BCDAA3A34AD23596BDB6CBE2122B794C8EAAEBFC6D526C232118BBCDA5D2CFB36561E152BAE8F0DDD14A36E284C7F163F41AC8D40B146880DD98194AD9706D05744765CEAF1FC0EE27F74A333CB74E5EFE361A17E03B745FFD53E12D5B0CA5E0DD07BF2B7130DFC606A2885758CB7ADBC85E817B490BEF516B6625DED11DF3AEE215B8BAF8073C345E3958977609BE7AD77C1378D33142F13DB62C9AE1AA94F9867ADD420393071E08D6746E2C61CF40D5074412FE805246A216B49B092C4B239C742A56D5C184AAB8FD78E833E780A47D8A4B28423C3E2F27B66B14A74BD26414B9C6114604E30C882F3D00B707CEE554D77D2085576810203010001
5. Find the element for <PackageFile Name="dotNetFX30\XPSEPSC-amd64-en-US.exe" and change the PublicKey value to the same as in step 4 above
6. Save the product.xml file


This is where the gotcha is. In the original instructions, there is a space after the opening angle bracket of each PackageFile element, making the XML malformed. VS 2008 will then not recognise the product correctly. Just make sure that space is not there and it will all work.
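For illustration, here is a sketch of the broken form next to the corrected one (the PublicKey value is truncated here, and any other attributes are omitted for brevity):

```xml
<!-- Malformed, as in the original MS instructions: note the space after '<' -->
< PackageFile Name="dotNetFX30\XPSEPSC-x86-en-US.exe" PublicKey="3082010A02..." />

<!-- Corrected: no space, so VS 2008 parses the element properly -->
<PackageFile Name="dotNetFX30\XPSEPSC-x86-en-US.exe" PublicKey="3082010A02..." />
```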

Wednesday 29 July 2009

Using SQL Server Application Roles through C#

I've not had to do this with C# before (I did it in Delphi with no problem), and found that there is an issue when using connection pooling.

The problem is that the connection appears to be left in an error state once a role has been set. The test I performed was as follows:

1. Set the approle.
2. Call a stored procedure from C# to populate a DataTable.
3. Call the stored procedure again.

On the second call to the stored procedure, I got an error whatever I did. If you do not set the approle, you get a permission denied error. If you do set the approle, you get a nasty-looking error telling you the connection needs to be reset. Either way it doesn't work.

As I am reluctant to use a dedicated connection (for a start this would involve rewriting a lot of code, not to mention any performance implications), I needed a proper solution. And here it is:

1. Open the connection.
2. Exec sp_setapprole.
3. Run the SQL.
4. Exec sp_unsetapprole.
5. Close the connection.

The only tricky bit is that you have to get the @cookie output from sp_setapprole and pass it into sp_unsetapprole to free it up. This is a varbinary, so it needs to be handled as a byte[] on the C# side.
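The same sequence can be sketched in plain T-SQL (the role name and password here are placeholders):

```sql
-- Activate the application role and capture the cookie
DECLARE @cookie varbinary(8000);
EXEC sp_setapprole @rolename = 'MyAppRole',         -- placeholder
                   @password = 'MyAppRolePassword', -- placeholder
                   @fCreateCookie = true,
                   @cookie = @cookie OUTPUT;

-- ... do the real work under the role's permissions ...

-- Deactivate the role, handing the cookie back
EXEC sp_unsetapprole @cookie;
```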

So, pseudo-ish code is

public DataSet GetSomeData()
{
    using (SqlConnection connection = GetConnection())
    {
        connection.Open();
        byte[] cookie = SetApprole(connection, approle, approlePassword);
        try
        {
            // ... call some SQL using the connection, fill and return a DataSet ...
        }
        finally
        {
            // Always unset, even if the SQL fails, so the pooled
            // connection goes back in a clean state
            UnsetApprole(connection, cookie);
        }
    }
}

public byte[] SetApprole(SqlConnection connection, string approle, string approlePassword)
{
    // sp_setapprole has to be sent as a direct ad-hoc batch, hence the
    // string concatenation. Beware of single quotes in the password -
    // escape them by doubling them up.
    StringBuilder sqlText = new StringBuilder();
    sqlText.Append("DECLARE @cookie varbinary(8000);");
    sqlText.Append("EXEC sp_setapprole @rolename = '" + approle + "', @password = '" + approlePassword + "'");
    sqlText.Append(", @fCreateCookie = true, @cookie = @cookie OUTPUT;");
    sqlText.Append(" SELECT @cookie;");

    if (connection.State == ConnectionState.Closed)
        connection.Open();

    using (SqlCommand cmd = connection.CreateCommand())
    {
        cmd.CommandType = CommandType.Text;
        cmd.CommandText = sqlText.ToString();
        cmd.CommandTimeout = commandTimeout; // class-level timeout field
        return (byte[])cmd.ExecuteScalar();
    }
}

And the final trick - calling sp_unsetapprole. This gave me issues, as it errored with a 15422 (Application roles can only be activated at the ad hoc level) on a CommandType.Text execution. It has to be a CommandType.StoredProcedure to work:

public void UnsetApprole(SqlConnection connection, byte[] approleCookie)
{
    if (connection.State == ConnectionState.Closed)
        connection.Open();

    using (SqlCommand cmd = connection.CreateCommand())
    {
        // Must be CommandType.StoredProcedure - a CommandType.Text
        // batch fails here with error 15422
        cmd.CommandType = CommandType.StoredProcedure;
        cmd.CommandText = "sp_unsetapprole";
        cmd.Parameters.AddWithValue("@cookie", approleCookie);
        cmd.CommandTimeout = commandTimeout; // class-level timeout field
        cmd.ExecuteNonQuery();
    }
}

Remember to add all your favourite try...catch blocks etc.

Tuesday 21 July 2009

Deadlock resolution

An excellent post here: A Deadlock Resolution - on how to track, trace and solve a deadlock issue.

SQL Memory Usage - Further Research into the Procedure Cache

Following on from the last post on SQL Server Memory usage, I was particularly interested in the large use of the procedure cache by ad-hoc queries. In the example I have here, there is a large number of ad-hoc queries, and the cache is constantly growing.

This is consistent behaviour: whenever a statement is executed, SQL Server searches the cache to see whether the statement has been used before. As an ad-hoc statement is unlikely to get a match, a new plan is created and added to the cache. These plans are not aged out all that quickly, so the effect is that the cache keeps growing, which in turn slows down the process of locating a plan. If there is demand on memory, SQL Server may then flush pages from the data cache, which also hurts performance.

The preferred solution is to use parameterised stored procedures instead of ad-hoc queries; however, if this is not possible (amending the application may be beyond your control), then flushing the procedure cache is the next best thing.
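If you have any control over how the statements are sent, even wrapping them in sp_executesql helps, as the parameterised form gives SQL Server a single plan to reuse (the table and column names below are made up for illustration):

```sql
-- Ad-hoc: every distinct literal gets its own cached plan
SELECT * FROM Orders WHERE CustomerID = 42;

-- Parameterised: one plan, reused whatever the @id value is
EXEC sp_executesql
    N'SELECT * FROM Orders WHERE CustomerID = @id',
    N'@id int',
    @id = 42;
```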

Ideally, we do not want to flush the whole cache, as genuinely reusable plans are a good thing. Clearing just the ad-hoc plans is therefore preferred. This is not obvious from BOL; however, I stumbled on the following statement during a Google trawl:

DBCC FREESYSTEMCACHE('SQL Plans')

This frees just the ad-hoc plans - perfect!
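To see how much the flush will actually buy you, it is worth measuring the cache first. On SQL Server 2005 and later, something along these lines should break the plan cache down by type (a sketch, not tested here):

```sql
-- Plan cache usage by object type; the 'Adhoc' and 'Prepared' rows
-- are the ones that FREESYSTEMCACHE('SQL Plans') clears
SELECT objtype,
       COUNT(*) AS plan_count,
       SUM(CAST(size_in_bytes AS bigint)) / 1048576 AS size_mb
FROM sys.dm_exec_cached_plans
GROUP BY objtype
ORDER BY size_mb DESC;
```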

It goes without saying that this needs to be tested, with proper performance figures to back it up. If you are short of memory, and/or are witnessing a slowdown through time on general SQL activity then this may be one of the solutions.

Wednesday 15 July 2009

SQL Memory Usage analysis

Analysing whether there is enough memory on a SQL Server is no easy task. Looking at Task Manager is of no real use; the only way to get meaningful figures is to use Performance Monitor. Here, knowing which metrics to record and how to interpret them is the complexity. This also needs to be combined with information provided by SQL Server itself.

This is not a comprehensive guide on what to do - see the websites below for more detailed information. This post is a starting point to hopefully get some figures up and running without too much investigation.

The main metrics to record are:

Memory: Available Bytes, Page Input/Sec, Pages/sec
Paging file: % Usage
Process: Page File Bytes Peak, Private Bytes, Working Set
SQLServer Memory Manager: Total Server Memory(KB)

Using these metrics, a sensible page file size can be estimated as Page File Bytes Peak * 0.7 (so a peak of 2 GB suggests a page file of around 1.4 GB).
Page file usage can then be compared to the following thresholds:

Memory\Available Bytes: no less than 4 MB
Memory\Pages Input/sec: no more than 10 pages
Paging File\% Usage: no more than 70 percent
Paging File\% Usage Peak: no more than 70 percent

In addition, the working set can be compared to the maximum available bytes, and in turn the actual physical RAM on the system. If the physical RAM (after other processes are taken into account) is exceeded, then there is not enough memory.

If the working set is less than the Total Server Memory, then this is evidence that it is being trimmed.

To see if the swap file is being used, the following metrics need to be recorded:

Memory – Available Mbytes
Memory – Committed Bytes
Memory – Pages Output/Sec
Paging - %Usage
Working Set – Sqlservr

Of real interest: if Pages Output/Sec is not 0, then the swap file is being used.

To analyse what the memory is being used for, DBCC MEMORYSTATUS can help to identify how efficiently the procedure cache is used. A large procedure cache may have an impact on SQL performance, and may result in less data being kept in memory, forcing SQL to go to disk.

Useful sites for further information are:

(for 64bit SQL) - How to reduce paging of buffer pool memory in the 64-bit version of SQL Server
Procedure Cache Tuning/Sizing
RAM, Virtual Memory, Pagefile and all that stuff