Monday 8 December 2008

Dynamic PIVOT table

The SQL Server 2005 PIVOT command is great if you have a fixed set of columns. However, if the columns are derived from data then it is unfortunately not possible to use a SELECT inside the PIVOT IN clause. The following dynamic SQL overcomes this shortfall:

--==============================
-- Create pivot "columns"
--==============================

DECLARE @cols NVARCHAR(MAX)

-- Concatenate distinct names into [Name1],[Name2],... ; STUFF strips the
-- leading "]," left over from the first concatenation
SELECT @cols = STUFF((SELECT DISTINCT TOP 100 PERCENT
                             '],[' + t2.Name
                      FROM ColumnNamesTable AS t2
                      ORDER BY '],[' + t2.Name
                      FOR XML PATH('')
                     ), 1, 2, '') + ']'

--==============================
-- Create SQL for Pivot table
--==============================
DECLARE @sqlStmt NVARCHAR(MAX)

SET @sqlStmt = N'SELECT MainColumn, ' + @cols +
               N' FROM ResultsTable' +
               N' PIVOT' +
               N' (SUM(Value) FOR ColumnName IN (' + @cols + N')) AS PivotedResults' +
               N' ORDER BY MainColumn'

EXEC sp_executesql @sqlStmt

Friday 17 October 2008

Memory check which tables are loaded into memory

Great tip here for analysing whether you are overstretching your memory in SQL Server. The root of the issue is that scanning a clustered index pulls the whole table into the buffer pool, because the clustered index effectively is the table. Should your table be of a significant size, then all of your memory (or more) may be required just to query it. One solution is to rebuild the clustered index as nonclustered - this may mean higher disk I/O, but since a nonclustered index is much smaller it has a far better chance of fitting into memory (a sketch of the change follows).
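
As a rough sketch of that change (the table, constraint and column names here are hypothetical - substitute your own), a clustered primary key can be rebuilt as nonclustered:

-- Hypothetical names: replace dbo.BigTable / PK_BigTable / Id with your own
ALTER TABLE dbo.BigTable DROP CONSTRAINT PK_BigTable
GO
-- Re-add the same key nonclustered; the table becomes a heap and only the
-- (smaller) index pages need to come into memory for key lookups
ALTER TABLE dbo.BigTable ADD CONSTRAINT PK_BigTable PRIMARY KEY NONCLUSTERED (Id)
GO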

A check for table memory usage is as follows:

-- Note: the 'percentage' column is actually a 0-1 ratio of cached to total data pages
SELECT t.name AS TableName,
       SUM(a.cached_pages) * 8 AS MemorySpaceKB,
       SUM(au.data_pages) * 8 AS StorageSpaceKB,
       CASE WHEN SUM(au.data_pages) <> 0 THEN
            SUM(a.cached_pages) / CAST(SUM(au.data_pages) AS NUMERIC(18,2))
       END AS 'Percentage Of Object In Memory'
FROM (SELECT database_id, allocation_unit_id, COUNT(page_id) AS cached_pages
      FROM sys.dm_os_buffer_descriptors
      GROUP BY database_id, allocation_unit_id) a
JOIN sys.allocation_units au
  ON a.allocation_unit_id = au.allocation_unit_id
JOIN sys.partitions p
  ON (au.type IN (1, 3) AND au.container_id = p.hobt_id)
  OR (au.type = 2 AND au.container_id = p.partition_id)
JOIN sys.tables t
  ON p.object_id = t.object_id
 AND t.is_ms_shipped = 0
WHERE a.database_id = DB_ID()
GROUP BY t.name

For further information on this and perfmon settings, look here

Tuesday 14 October 2008

TFS Event Subscription

Finally got round to looking into this. Firstly, the next iteration of the 2008 Power Tools will introduce a nicer interface for maintaining event subscriptions. However, that is not due quite yet, so in the meantime there are two tools to fill the gap.

The TfsAlert tool provides a system tray app that pops up a balloon tip whenever a monitored event fires. It monitors 8 different events and allows filtering. Filtering is configured by editing the config file, so knowledge of the subscription syntax is required; however, that syntax can be built with the next tool.

The second tool, TFS Event Subscription, is a GUI builder for event subscription strings. It allows you to subscribe to a particular event and have a message sent by SOAP, plain text or HTML email. This is a standard feature of TFS, but this interface can be run on a client and is a great deal easier to use than the command line.

Lastly, this blog provides details on how to amend the emails sent by TFS to point to the TFS Web Access for the work item, rather than a summary report.

Tuesday 2 September 2008

Clustered SQL Server error 28000 - cluster service login error

Whilst testing a failover for a clustered (W2K3) SQL Server 2005, we experienced error 28000, failing to authenticate the cluster service account. This meant that the SQL Server service did not restart under cluster management (though it was possible to start it from the command prompt or via Services).

This was due to removing the BUILTIN\Administrators group from SQL Server security. Although this is standard practice, it appears that SQL Server is unable to verify the cluster service account via an AD group. The solution is either to add the BUILTIN\Administrators login back (not desired) or to create a login for the cluster service account itself.
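
A minimal sketch of the second option, assuming the cluster service runs as MYDOMAIN\ClusterSvc (a hypothetical account name):

USE master
GO
-- MYDOMAIN\ClusterSvc is a placeholder - substitute the real cluster service account.
-- A direct login lets the cluster's health check authenticate without relying on an AD group.
CREATE LOGIN [MYDOMAIN\ClusterSvc] FROM WINDOWS
GO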

Monday 18 August 2008

SQL 2005 replication history

One of the shortcomings of the new Replication Monitor is its lack of history. SQL 2000 was a lot better at this IMO. So, to get this information you need to create a report using SQL along these lines:

-- Run against the distribution database
DECLARE @subscriber NVARCHAR(256) -- NULL = all subscribers, or 'subscriber:subscriber_db'
      , @dateFrom DATETIME
      , @dateTo DATETIME

SELECT ms.publisher
, ms.publisher_db
, ms.subscriber
, ms.subscriber_db
, mh.comments
, mh.[time]
FROM dbo.MSmerge_subscriptions ms
INNER JOIN dbo.MSmerge_agents ma
ON ms.subscriber = ma.subscriber_name
AND ms.subscriber_db = ma.subscriber_db
INNER JOIN dbo.MSmerge_sessions mse
ON ma.id = mse.agent_id
INNER JOIN dbo.MSmerge_history mh
ON mse.session_id = mh.session_id
AND mse.agent_id = mh.agent_id
WHERE ms.subscription_type = 0
AND (ms.subscriber + ':' + ms.subscriber_db = @subscriber OR @subscriber IS NULL)
AND mh.[time] BETWEEN @dateFrom AND @dateTo


To get the subscribers :

SELECT DISTINCT subscriber + ':' + subscriber_db AS subscriber
FROM dbo.MSmerge_subscriptions
WHERE subscription_type = 0

Skip SQL Statement when user does not have permission or linked server missing

Had an interesting problem with a stored procedure that checked replication history. This SP queried both a local and a remote server via a linked server. Occasionally the link would drop, and whenever replication was rebuilt the user would forget to set up permissions to the re-generated distribution database. Either situation throws errors that stop the SQL from completing. To save redoing the existing reports, I needed a solution whereby the remote SQL is skipped if it fails.

The obvious answer is TRY..CATCH, but there is an issue with this: TRY..CATCH cannot trap errors from a linked-server statement that fails at compile time (for example, when the linked server or the remote object is missing), because the batch is aborted before the CATCH block ever runs. After some head scratching it was found that this could be overcome using EXEC sp_executesql - deferring compilation of the remote statement to execution time turns the failure into a catchable runtime error. Since the SQL will not be in a plan (it is, after all, on a different server) there is no performance issue here either. So, the SQL read along the lines of:

DECLARE @sqlstmt NVARCHAR(4000)

BEGIN TRY
    SET @sqlstmt = N'SELECT xxx FROM remoteserver.database.dbo.yyy'
    -- Compilation of the remote statement is deferred to here, so a
    -- failure becomes a catchable runtime error
    EXEC sp_executesql @sqlstmt
END TRY
BEGIN CATCH
    -- Ignore errors from the remote system (dropped link, missing permissions)
END CATCH

Friday 25 July 2008

Reporting Services Performance Monitoring

Useful technet article on how to monitor SSRS performance report by report.

http://technet.microsoft.com/en-us/library/aa964131.aspx

Friday 27 June 2008

Merge replication error Msg 20092, Level 16, State 1

Recently I created a (fortunately) test environment in SQL 2005 for a merge replication that was to upgrade an existing SQL 2000 installation. I didn't want to re-initialise the subscriber, as there was a large amount of data involved (about 115 million rows). Once the subscriber was created (and the snapshot generated), I tested the merge by updating a particular set of data on the publisher first and then on the subscriber, to check that the replication worked. The publisher update went fine and the data replicated successfully. However, when I tried to update the rows at the subscriber I got error Msg 20092, Level 16, State 1, Procedure MSmerge_disabledml....

Table 'XXXXXX' into which you are trying to insert, update, or delete data is currently being upgraded or initialized for merge replication. On the publisher data modifications are disallowed until the upgrade completes and snapshot has successfully run. On subscriber data modifications are disallowed until the upgrade completes or the initial snapshot has been successfully applied and it has synchronized with the publisher.

Msg 3609, Level 16, State 1, Line 1

The transaction ended in the trigger. The batch has been aborted.



Fortunately the solution for this is quite straightforward. It seems that somehow during the creation of the subscriber (and I admit this was done a few times) the merge replication upgrade triggers were not removed.

To correct this scenario, run the following SQL on the subscriber, then drop the subscription and re-create it. The script is (OK, I know it uses a cursor, but it's a one-off fix script so who cares!):

DECLARE @sql NVARCHAR(4000),
        @trigger NVARCHAR(100)

DECLARE trig_curs CURSOR FOR
    SELECT name
    FROM sys.triggers
    WHERE name LIKE 'MSmerge%'
      AND parent_class = 1

OPEN trig_curs

FETCH NEXT FROM trig_curs INTO @trigger

WHILE @@FETCH_STATUS = 0
BEGIN
    SET @sql = 'DROP TRIGGER [' + @trigger + ']'

    PRINT @sql

    EXEC sp_executesql @sql

    FETCH NEXT FROM trig_curs INTO @trigger
END

CLOSE trig_curs
DEALLOCATE trig_curs

Friday 30 May 2008

Deadlocks - solving

Great blog here on how to investigate the SQL causing deadlocks. Pretty comprehensive, with clear steps concentrating on the information presented by SQL 2005.

Wednesday 7 May 2008

Data Table Shrinking

One of the most annoying features of SQL Server is that the GUI-based shrink commands rarely work. In a development environment we do not always have the space to store a database and its predicted growth, so shrinking is a necessity.

The following script will shrink all the files in a database - data and log - and free up that much-needed space. Just edit it and replace the logical file names. It will shrink any number of files - add or remove DBCC SHRINKFILE commands as required.

USE master
GO
-- TRUNCATE_ONLY discards the log without backing it up; it breaks the log
-- backup chain, so only use it in development environments like this
BACKUP LOG [BSE] WITH TRUNCATE_ONLY
GO
DBCC SHRINKDATABASE(N'BSE')
GO
USE BSE
GO
DBCC SHRINKFILE(BSE_dat, 10)
GO
DBCC SHRINKFILE(BSE_dat2, 10)
GO
DBCC SHRINKFILE(BSE_dat3, 10)
GO
DBCC SHRINKFILE(BSE_dat4, 10)
GO
DBCC SHRINKFILE(BSE_Log, 3)
GO

Tally Table - loop replacement

Excellent article here on using Tally Tables instead of a loop. The performance increases are amazing. In essence the Tally Table replaces the loop, and using set-based logic rather than a traditional loop gives big performance gains. Worth a read at least.
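
As a flavour of the technique, here is a minimal sketch (my own, not from the article) that splits a comma-delimited string using an inline tally of numbers instead of a WHILE loop:

DECLARE @csv VARCHAR(8000)
SET @csv = ',' + 'red,green,blue' + ','  -- wrap with delimiters to simplify the maths

;WITH Tally (N) AS
(
    -- Generate the numbers 1 to 8000 without a loop
    SELECT TOP 8000 ROW_NUMBER() OVER (ORDER BY (SELECT NULL))
    FROM sys.all_columns a CROSS JOIN sys.all_columns b
)
SELECT SUBSTRING(@csv, N + 1, CHARINDEX(',', @csv, N + 1) - N - 1) AS Item
FROM Tally
WHERE N < LEN(@csv)
  AND SUBSTRING(@csv, N, 1) = ','  -- every element starts one past a delimiter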

Wednesday 23 April 2008

SQL String Formatting

Very nice set of functions to format strings in SQL. Have a look here for details

Back from hols

I've been a little quiet for a few weeks or so, mostly due to being on vacation. A good trip to Florida to recharge etc; back to the grind now, and hopefully I'll keep this up to date as well.

Wednesday 12 March 2008

Tips for upgrading TFS 2005 to TFS 2008

I found this site to be very useful. I followed the instructions and they were pretty much on the money. I had a couple of minor issues - mostly to do with domain changes, so beyond the scope of Grant's instructions.

So, success in upgrading to TFS 2008 and WSS 3.0 - and it was relatively painless as well.

The main things I would recommend when doing this are:

1. Test the disaster recovery - an upgrade is as good an excuse as any (use a standby server).
2. Trial-run the upgrade on a copy on a standby server.
3. Take a disk image of your server before you start - that way, if it all goes very bad, you can recover easily.
4. Be prepared for things to go wrong - the pre-scans and best-practice checks are fairly strict.
5. Change the SharePoint Administration shortcut in Administrative Tools to point to the new port.

Upgrade TFS 2005 WSS2.0 to TFS 2008 WSS3.0

Stolen from here are the steps to upgrade WSS2.0 to WSS3.0 when upgrading TFS 2005 to TFS 2008. Worked well for me!


Run the Orcas TFS installer to upgrade your VS2005 TFS server. At the end of this process you should have an Orcas server with WSS 2.0 configured.

Next, run the WSS prescan.exe tool to verify that you can upgrade WSS from 2.0 to 3.0.

Run the WSS 3.0 installer and select the upgrade option.

The installer will ask you for a new Administration port. You cannot use the old port (17012); you need to select a new port number.

Complete all the installation and upgrade steps for WSS 3.0.

Now you should have an Orcas server with WSS 3.0. You still cannot use the system, as you need to tell TFS about the new URLs for WSS and also upload the new site templates for WSS 3.0.

Run the SharePoint Extensions for TFS setup. This should be located in the InstallMedia\WssExt directory. This will upload the new site templates for WSS 3.0.
The upgrade changes the WSS admin site URL. To make this change in TFS you can use TFSAdminUtil. The following is the command to use:

TFSAdminUtil configureconnections /SharepointAdminUri:http://wssserver:adminport

Note: You can use TFSAdminUtil to update all the different URLs for WSS. The following is the syntax for updating all WSS-related attributes in TFS:

TFSAdminUtil configureconnections
/SharepointURI:http://wssserver:80
/SharepointSitesUri:http://wssserver:80/sites
/SharepointAdminUri:http://wssserver:adminport
/SharepointUnc:\\wssserver\sites


Finally - make sure you change the shortcut to SharePoint Central Administration in Administrative Tools to point to the new port!

I had a problem with Central Administration reporting Service Unavailable.
This was due to an incorrect login in the application pool (IIS - SharePoint Central Administration v3), which was fairly easy to find.

Excel interop with C# memory leak - Sheet.Rows.Count and Sheet.Columns.Count

Investigation of the horrendous memory leak in my Office 2007 Ribbon application led me (by trial and error, tbh) to an issue with the Interop Sheet.Rows.Count. I was using this - naively - to count the number of rows on a particular sheet, and was then using that figure to create a range which I cleared.

The only problem is that Sheet.Rows.Count and Sheet.Columns.Count return the maximum possible rows/columns on a sheet, even if they are not populated with data. This is not what I wanted: I wanted the dimensions of the populated area so that I could blank it.

Coupled to this, Range.Clear() seems to retain memory for a long period of time - though this may be due to the size of the range created by the erroneous returns of Sheet.Rows.Count/Sheet.Columns.Count.

I am now in the process of finding a different way to do this, and may have a possible solution. I will post it up when I have confirmed it.

Tuesday 11 March 2008

.NET Memory Usage

Had an issue with a user complaining that Excel uses too much memory in task manager, so I investigated what the usage really is.

The first point to note is that Task Manager does not accurately report the memory used by an application. It shows the working set - that is, memory that has been "reserved" by Windows just in case. If another application needs the memory then it is re-allocated. As such, we really need to measure memory usage using Performance Monitor.

The main counters to add are from the Process object. Make sure you select your .NET application, and add the counters Private Bytes and Working Set. Private Bytes gives the true(ish) figure - however, bear in mind that this memory may be swapped to disk, so it will not necessarily have an impact on your system.

The article that explains this further can be found here.

Monday 10 March 2008

Custom Report Items in SSRS 2005

I was recommended the following by Jon as a possible solution to a calendar painting issue that Owen has. SSRS can be extended using custom report items (CRIs), which should allow us to do just about anything. Jazz Up Your Data Using Custom Report Items In SQL Server Reporting Services runs through an example of this and is worth a look.

Friday 7 March 2008

Diagnosing Replication Issues

I'm in the process of creating a transactional replication solution that incorporates some transformation. The condensed version is that the publisher data structure is nothing like the subscriber structure. To do this I have replaced the standard sp_MSins/del/upd stored procedures with my own code. There is some simple transformation going on in these SPs, and debugging it is not nice.

So, imagine my delight when an article from SQLServerCentral dropped into my Inbox - the timing couldn't have been better.

The important bit for me was establishing where the problem lies and what to do with it. The way to do this is as follows:

In Replication Monitor examine the error message in the detail screen. It will have something like this :

Command attempted:
if @@trancount > 0 rollback tran
(Transaction sequence number: 0x0000018F0000008800A700000000, Command ID: 83)

Error messages:
etc.

The bits to note are the Transaction sequence number and Command ID. You can drop the trailing zeros from the sequence number.

Next, query MSdistribution_agents in the distribution database and get the publisher_database_id. Note this number; you'll need it.
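
For example (a minimal sketch - the columns are standard, pick out the row for your failing agent):

-- Run in the distribution database
SELECT name, publisher_db, subscriber_db, publisher_database_id
FROM dbo.MSdistribution_agents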

Finally, call the procedure sp_browsereplcmds :

exec sp_browsereplcmds
@xact_seqno_start = '0x0000018F0000008800A7',
@xact_seqno_end = '0x0000018F0000008800A7',
@publisher_database_id = 15,
@command_id = 83

Examine the contents of the command column - this is what is being sent to your stored procedure. You can now run it by hand and see what the issue is. I recommend wrapping it in a transaction and rolling back, so that you do not introduce other errors.
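
Something along these lines (the procedure name and parameter values below are hypothetical - paste the actual text from the command column):

BEGIN TRANSACTION

-- Replay the failing command exactly as sp_browsereplcmds reported it
EXEC dbo.sp_MSins_MyTable @c1 = 42, @c2 = N'test'

ROLLBACK TRANSACTION -- inspect the error, then roll back to leave the data untouched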

Friday 29 February 2008

Upgrade TFS 2005 to 2008 : Error 28805 cannot finish the request to the SQL Server 2005 Reporting Service

Another problem I ran into at the end of the installation, and this one is not documented in the official installation instructions - it should be.

All the permissions checked out and SSRS was running, so this error is a little weird. It turns out to be due to the encryption keys held by Reporting Services. There are two ways round this:

1. Drop the SSRS databases.
2. Remove the encryption.

I went for the second option - didn't fancy losing whole databases. To remove the encryption, open the Reporting Services Configuration Manager, go to Encryption Keys and click Delete.

Forum post is here

Upgrade TFS 2005 to 2008 : Workflow 'Quiesce DT' failed! ExitCode = 8000

Whilst attempting to upgrade the test TFS 2005, I got a rather nasty error box. Examination of the logs was a touch difficult (wood for trees), but I managed to spot the 'Quiesce DT' failure. Further digging and I spotted "No associated service account for role TFSEXECROLE". Not entirely sure what this meant, I found this post describing exactly the same issue. The bottom line is that every TFS database must have the TFSService account assigned to the TFSEXECROLE database role. The following SQL helps identify the role members; run it against each TFS database until the incorrect one is found:

SELECT dpMember.name
FROM sys.database_principals dp
JOIN sys.database_role_members drm
ON drm.role_principal_id = dp.principal_id
JOIN sys.database_principals dpMember
ON dpMember.principal_id = drm.member_principal_id
WHERE dpMember.Type = 'U'
AND dp.name IN ('TFSEXECROLE')

Then add the TFSService account to the role in the database where it is missing.
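
A minimal sketch, assuming the service account is MYDOMAIN\TFSService (substitute your own):

-- Run in the TFS database that is missing the membership
-- (the account must already exist as a user in that database)
EXEC sp_addrolemember @rolename = N'TFSEXECROLE', @membername = N'MYDOMAIN\TFSService'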

Thursday 28 February 2008

Query execution failed for data set 'IterationParam'. (rsErrorExecutingCommand)

Got this when restoring TFS to test the disaster recovery. The solution is as follows from MSDN forums:

This might be due to a known Analysis Services issue that can be fixed by applying the SQL Server 2005 KB914595 HotFix from http://msdn.microsoft.com/vstudio/teamsystem/downloads/

If this does not work, I would recommend deleting the cube and recreating the warehouse by following the following steps:

Step 1:
1. Stop SQL Server Analysis Services - in Management Studio, connect to Analysis Services, right-click the Analysis Server and select Stop.
2. Make a backup copy of the Analysis Server "data" folder: Program Files\Microsoft SQL Server\MSSQL.2\OLAP\Data
3. Delete the Data folder.
4. Restart Analysis Services.

Step 2
Run the setupWarehouse command from the application tier:
C:\Program Files\Microsoft Visual Studio 2005 Team Foundation Server\Tools\Setupwarehouse.exe -o -mturl http://|APP_TIER|:8080 -c warehouseschema.xml -s |DATA_TIER| -d TfsWarehouse -a |DOMAIN|\|SERVICE_ACCOUNT| -ra |DOMAIN|\|REPORTING_ACCOUNT|

|APP_TIER| is the name of the app tier server.
|DATA_TIER| is the name of the data tier server
|DOMAIN|\|SERVICE_ACCOUNT| is the account used for TFS Services (set in your domain)
|DOMAIN|\|REPORTING_ACCOUNT| is the account used for TFS Reporting (set in your domain)

Wait for this process to complete.

Step 3:
From the application tier server
Open http://localhost:8080/Warehouse/v1.0/warehousecontroller.asmx.
Click Run and then click Invoke.
Open http://localhost:8080/Warehouse/v1.0/warehousecontroller.asmx.
Click GetWarehouseStatus, and then click Invoke.
Wait for the service to go idle. The service should return RunningAdapters, ProcessingOlap, and then Idle. The four possible return states are:
Idle: The warehouse is not processing.
RunningAdapters: The adapters are applying schema changes or pulling data into the warehouse.
ProcessingOlap: The warehouse is processing schema changes or data changes in the OLAP cube.
Blocked: The warehouse is blocked.

Step 4:
Process the TFSWarehouse cube in Analysis Services using Management Studio.


Credit to Othmane Rahmouni-MSFT
Forum Post : Warehouse Cube Won't Process - B3Refresh Upgraded OK

**Further findings**

If the above does not solve your problem, then it could be due to simple permissions on the datasources.

To verify this, open any report in the TFS portal. If you are not presented with the normal parameters - i.e. it asks for a login and password, or it complains about not being able to connect to TfsOlapReportDS - then this may be due to security.

By default, SSRS installs using "Credentials supplied by the user running the report". This does not seem (in my case) to work with TFS. To correct this issue you must:

1. Go to site settings.
2. Select the TfsOlapReportDS data source.
3. Create (or check) the ConnectionString. It should be along the lines of : Data source=TFS_SERVER;initial catalog=TFSWarehouse
4. Change the Connect Using to use Credentials stored securely in the report server. Set this to use your fixed TFSReports login and password.
5. Check "Use as Windows credentials when connecting to the data source".
6. Repeat this for TfsReportDS.

All should work now!

Friday 22 February 2008

Merge replication CPU usage goes to 100%

There is an issue with merge replication whereby the MSmerge_contents and MSmerge_history tables do not tally. If this occurs then the only solution (so far) is to drop and recreate the database and replication.

The work-around to prevent this happening is :

update sysmergepublications set [generation_leveling_threshold] = 0

This stops generations being created in the first place, so the issue is avoided.

Parameter Sniffing - SQL executes faster as a query than as a stored procedure

This strangely monickered problem caught me out last week.

The scenario is that you have a piece of SQL that runs quicker as SQL executed in query analyzer than as a stored procedure. The reason for this is a little obtuse, and worth remembering.

It seems that SQL Server stored procedure plans can have issues with parameter values. This affects both SQL Server 2005 and 2000. The official line is:

"Parameter sniffing" refers to a process whereby SQL Server's execution environment "sniffs" the current parameter values during compilation or recompilation, and passes it along to the query optimizer so that they can be used to generate potentially faster query execution plans. The word "current" refers to the parameter values present in the statement call that caused a compilation or a recompilation"

See Batch Compilation, Recompilation, and Plan Caching Issues in SQL Server 2005


What this means in practice is that the stored procedure plan may be optimized for a particular result set volume, and when the stored procedure is executed with different parameters the plan is no longer efficient.

There are a few things that can be done to circumvent this issue :

1. Only use local variables in the body of your procedure, assigning the parameters to them at the start. This forces SQL Server to create a generic plan (see the sketch after this list).
2. Call the stored procedure WITH RECOMPILE each time. This forces a new plan to be created, which may be quicker than reusing the existing, badly-matched plan.
3. Move statements into sub stored procedures - the plans will be smaller and more reliable.
4. Encode any statements that reference the variables into string variables and run them with sp_executesql.
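
A minimal sketch of option 1, with hypothetical table and column names:

CREATE PROCEDURE dbo.GetOrdersByStatus
    @status INT
AS
BEGIN
    -- The optimizer cannot sniff @localStatus, so the plan is built for a
    -- "generic" value rather than whatever was passed in on first execution
    DECLARE @localStatus INT
    SET @localStatus = @status

    SELECT OrderId, CustomerId, OrderDate
    FROM dbo.Orders
    WHERE OrderStatus = @localStatus
END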

In the case that I had, I reduced a heavy stored procedure from over 4 minutes to about 30 seconds. However, in my case only items 3 and 4 made any discernible difference.

It's a wonder this doesn't happen more often - it seems to be a strange catch that only appears in certain circumstances.

For a detailed example of this look on Omnibuzz's blog

Thursday 21 February 2008

Deploying Office Tools SE with VS 2005 / Creating Custom Bootstrap

I'm in the process of working out how to include the Office 2007 PIA/VSTO redistributable packages with my application (an Office 2007 Ribbon app). Unfortunately these packages do not appear in the Prerequisites list in VS 2005.

As part of my investigations I stumbled on a nice Custom Bootstrap Manifest Generator written by David Guyer. Looks to be very useful - yet to use it, but worth bookmarking.

Find it here : Bootstrapper Manifest Generator

Also, I have found a nice article on how to deploy Office 2003/2007 solutions using Windows Installer. It's a 2 part walkthrough.

Of major interest is the Package.xml file for PIA (2003/2007).

I've run through this and it all works - though not exactly how I wanted. You have to execute the setup.exe in order to install .NET 2.0, VSTO and the PIA. It is interactive, requiring some clicks on Next buttons. I really wanted it to be silent, but as this only needs to be run once it is OK. The MSI can be executed silently thereafter to install updates, so it's good enough.

Wednesday 20 February 2008

Profiler and Perfmon

Found this article about how to combine Profiler and Perfmon - very useful for analysing performance of particular SQL activities.

Integrating Profiler and Perfmon Log Files

SQL Server 2005 : Procedure Cache memory usage keeps growing

We have had an issue with a clustered instance of SQL Server 2005 whereby it would run out of memory. Several things were tried but didn't seem to make any difference.

The server in question was at RTM and, due to changes in infrastructure, could not be upgraded to SP2. Without it being on SP2 we were not able to properly ascertain the issue.

A new installation was created and updated to SP2 (3050) and the majority of the applications were moved.

We have been monitoring the memory and so far SP2 seems to have sorted the issue. Further investigation highlighted an issue with SQL 2005 RTM whereby the procedure cache is not properly flushed, and our observations seem to agree with this. The monitoring needs to run over a few days, but so far it is looking OK.

The SQL used to monitor the cache is as follows; we also used Idera to monitor visually:


-- Get the size of the plan cache (CACHESTORE_SQLCP is non-SP and CACHESTORE_OBJCP is SP)
SELECT (SUM(single_pages_kb) + SUM(multi_pages_kb)) / (1024.0 * 1024.0) AS 'Plan Cache Size (GB)'
FROM sys.dm_os_memory_cache_counters
WHERE type = 'CACHESTORE_SQLCP'
   OR type = 'CACHESTORE_OBJCP'

-- Top 10 consumers of memory from the buffer pool
SELECT TOP 10 type, SUM(single_pages_kb) AS [SPA Mem, KB]
FROM sys.dm_os_memory_clerks
GROUP BY type
ORDER BY SUM(single_pages_kb) DESC

-- Use counts and number of plans for ad-hoc plans
SELECT usecounts, COUNT(*) AS no_of_plans
FROM sys.dm_exec_cached_plans
WHERE cacheobjtype = 'Compiled Plan'
  AND objtype = 'Adhoc'
GROUP BY usecounts
ORDER BY usecounts

-- Find the ad-hoc queries that are bloating the plan cache
SELECT TOP (100) *
FROM sys.dm_exec_cached_plans
CROSS APPLY sys.dm_exec_sql_text(plan_handle)
WHERE cacheobjtype = 'Compiled Plan'
  AND objtype = 'Adhoc'
  AND usecounts = 1
--AND size_in_bytes < 200000
ORDER BY size_in_bytes DESC

Tuesday 19 February 2008

TFS : The permissions granted to user are insufficient for performing this operation. (rsAccessDenied)

I got the error message:
"The permissions granted to user are insufficient for performing this operation. (rsAccessDenied)" for a user today when running the reports from Team Foundation Server.

This was an existing user who had not experienced this particular error before. I had just created a new project and assigned the AD user group (using the excellent Power Tool - Team Foundation Server Administration Tool) as a contributor/publisher. I even created them as a standalone user, and still no joy.

After some web crawling I found that there is an interesting anomaly in Reporting Services: the "Publisher" role by default does not have the ability to view reports. OK....

To resolve the issue :

1. Open SSMS and connect to the Reporting Services instance.
2. Expand Security > Roles.
3. Open the Properties window for the Publisher role and check:
   View Reports
   View Resources
   View Folders
   View Models
4. Click OK.

For more information see :
Lesson 1: Setting System-level Permissions on a Report Server
and
Lesson 2: Setting Item-level Permissions on a Report Server

In the beginning...

I blame Owen for this. I've always kept notes but never centralised them, so this blog is an attempt at recording snippets of (hopefully) useful information that I have stumbled on as part of my working life. Some of it may be useful to others, but mostly it will be useful for me to refer to, as my ability to retain information long term ain't what it used to be.



Enjoy :)