Friday 29 February 2008

Upgrade TFS 2005 to 2008 : Error 28805 cannot finish the request to the SQL Server 2005 Reporting Service

Another problem I ran into at the end of the installation. This one is not documented in the official installation instructions, and it should be.

All the permissions checked out and SSRS was running, so this error is a little weird. It turns out to be due to the encryption keys on Reporting Services. There are two ways round this:

1. Drop the SSRS databases.
2. Remove the encryption.

I went for the second option - didn't fancy losing whole databases. To remove the encryption, open the Reporting Services Configuration Manager, go to Encryption Keys and click Delete.
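
The same thing can also be done from the command line with the rskeymgmt utility that ships with SSRS. A sketch only - run it on the report server, and note that deleting the keys removes all encrypted content such as stored connection strings:

rskeymgmt -d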

Forum post is here

Upgrade TFS 2005 to 2008 : Workflow 'Quiesce DT' failed! ExitCode = 8000

Whilst attempting to upgrade the test TFS 2005, I got a rather nasty error box appear. Examination of the logs was a touch difficult (wood for trees), but I managed to spot the 'Quiesce DT failed' message. Further digging and I spotted "No associated service account for role TFSEXECROLE". Not entirely sure what this meant, I found this post describing exactly the same issue. The bottom line is that every TFS database must have the TFSService account assigned to the TFSEXECROLE database role. The following SQL helps identify this; run it against each TFS database until the incorrect one is found:

-- List the Windows users that are members of TFSEXECROLE in the
-- current database; TFSService should appear in the results
SELECT dpMember.name
FROM sys.database_principals dp
JOIN sys.database_role_members drm
ON drm.role_principal_id = dp.principal_id
JOIN sys.database_principals dpMember
ON dpMember.principal_id = drm.member_principal_id
WHERE dpMember.Type = 'U'
AND dp.name IN ('TFSEXECROLE')

Then add TFSService to the role.
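
A sketch of the fix (the TfsIntegration database name and the DOMAIN\TFSService account are placeholders for your own):

USE TfsIntegration
GO
-- Map the login to a database user first if it is not already
-- CREATE USER [DOMAIN\TFSService] FOR LOGIN [DOMAIN\TFSService]
EXEC sp_addrolemember N'TFSEXECROLE', N'DOMAIN\TFSService'
GO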

Thursday 28 February 2008

Query execution failed for data set 'IterationParam'. (rsErrorExecutingCommand)

Got this when restoring TFS to test the disaster recovery. The solution, from the MSDN forums, is as follows:

This might be due to a known Analysis Services issue that can be fixed by applying the SQL Server 2005 KB914595 HotFix from http://msdn.microsoft.com/vstudio/teamsystem/downloads/

If this does not work, I would recommend deleting the cube and recreating the warehouse by following these steps:

Step 1:
Stop SQL Server Analysis Services: use Management Studio, connect to Analysis Services, right-click the Analysis Server and select Stop.
Make a backup copy of the Analysis Server "data" folder (see the example command after these steps): Program Files\Microsoft SQL Server\MSSQL.2\OLAP\Data
Delete the Data folder.
Restart Analysis Services.
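
For the backup, something along these lines from a command prompt should do (the destination path is just an example):

xcopy "C:\Program Files\Microsoft SQL Server\MSSQL.2\OLAP\Data" "D:\Backups\OLAPData" /E /I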

Step 2:
Run the Setupwarehouse command from the application tier:
C:\Program Files\Microsoft Visual Studio 2005 Team Foundation Server\Tools\Setupwarehouse.exe -o -mturl http://|APP_TIER|:8080 -c warehouseschema.xml -s |DATA_TIER| -d TfsWarehouse -a |DOMAIN|\|SERVICE_ACCOUNT| -ra |DOMAIN|\|REPORTING_ACCOUNT|

|APP_TIER| is the name of the app tier server.
|DATA_TIER| is the name of the data tier server.
|DOMAIN|\|SERVICE_ACCOUNT| is the account used for TFS Services (set in your domain).
|DOMAIN|\|REPORTING_ACCOUNT| is the account used for TFS Reporting (set in your domain).
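
For example, with entirely made-up server and account names, the call might look like:

C:\Program Files\Microsoft Visual Studio 2005 Team Foundation Server\Tools\Setupwarehouse.exe -o -mturl http://TFSAPP01:8080 -c warehouseschema.xml -s TFSDATA01 -d TfsWarehouse -a MYDOMAIN\TFSService -ra MYDOMAIN\TFSReports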

Wait for this process to complete.

Step 3:
From the application tier server:
Open http://localhost:8080/Warehouse/v1.0/warehousecontroller.asmx.
Click Run, and then click Invoke.
Open http://localhost:8080/Warehouse/v1.0/warehousecontroller.asmx.
Click GetWarehouseStatus, and then click Invoke.
Wait for the service to go idle. The service should return RunningAdapters, then ProcessingOlap, and then Idle. The four possible return states are:
Idle: The warehouse is not processing.
RunningAdapters: The adapters are applying schema changes or pulling data into the warehouse.
ProcessingOlap: The warehouse is processing schema changes or data changes in the OLAP cube.
Blocked: The warehouse is blocked.

Step 4:
Process the TfsWarehouse database in Analysis Services using Management Studio.


Credit to Othmane Rahmouni-MSFT
Forum Post : Warehouse Cube Won't Process - B3Refresh Upgraded OK

Further findings

If the above does not solve your problem, then it could be due to simple permissions on the datasources.

To verify this, open any report in the TFS portal. If you are not presented with the normal parameters, i.e. it asks for a login and password, or if it complains about not being able to connect to the TfsOlapReportDS data source, then this may be due to security.

By default, SSRS installs using "Credentials Supplied by the User Running the Report". This does not seem (in my case) to work with TFS. To correct this issue you must:

1. Go to site settings.
2. Select the TfsOlapReportDS data source.
3. Create (or check) the connection string. It should be along the lines of: Data source=TFS_SERVER;initial catalog=TFSWarehouse
4. Change "Connect Using" to "Credentials stored securely in the report server". Set this to use your fixed TFSReports login and password.
5. Check "Use as Windows credentials when connecting to the data source".
6. Repeat this for TfsReportDS.
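
If the stored credentials still cannot read the warehouse, it is also worth checking that the reporting account has read access on the TfsWarehouse database. A sketch, assuming the TfsWarehouseDataReader role that TFS creates and a placeholder account name:

USE TfsWarehouse
GO
-- Map the login to a database user first if it is not already
-- CREATE USER [DOMAIN\TFSReports] FOR LOGIN [DOMAIN\TFSReports]
EXEC sp_addrolemember N'TfsWarehouseDataReader', N'DOMAIN\TFSReports'
GO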

All should work now!

Friday 22 February 2008

Merge replication CPU usage goes to 100%

There is an issue with merge replication whereby the MSmerge_contents and MSmerge_history tables do not tally. If this occurs then the only solution (so far) is to drop and recreate the database and the replication.

The work-around to prevent this happening is:

-- Run against the publication database
update sysmergepublications set [generation_leveling_threshold] = 0

This stops generations being created in the first place, so the issue is avoided.
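
To see the current value of the threshold before and after making the change, a simple check is:

-- Run against the publication database
SELECT name, generation_leveling_threshold
FROM dbo.sysmergepublications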

Parameter Sniffing - SQL executes faster as a query than as a stored procedure

This strangely monikered problem caught me out last week.

The scenario is that you have a piece of SQL that runs quicker when executed directly in Query Analyzer than as a stored procedure. The reason for this is a little obscure, and worth remembering.

It seems that SQL Stored Procedure Plans can have issues with parameter variables. This affects both SQL Server 2005 and 2000. The official line is :

"Parameter sniffing" refers to a process whereby SQL Server's execution environment "sniffs" the current parameter values during compilation or recompilation, and passes it along to the query optimizer so that they can be used to generate potentially faster query execution plans. The word "current" refers to the parameter values present in the statement call that caused a compilation or a recompilation"

See Batch Compilation, Recompilation, and Plan Caching Issues in SQL Server 2005


What this means in practice is that the stored procedure plan may be optimized for a particular result set volume, and when the stored procedure is executed with different parameters the plan is no longer efficient.

There are a few things that can be done to circumvent this issue :

1. Only use local variables in your procedure, and assign the parameter values to them at the start. This forces SQL Server to create a generic plan (see the sketch after this list).
2. Call the stored procedure WITH RECOMPILE each time. This forces a new plan to be created, which may be quicker than using the existing optimized plan.
3. Move statements into sub stored procedures - the plans will be smaller and more reliable.
4. Encode any statements that reference the variables into string variables and execute them with sp_executesql.
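
A minimal sketch of option 1, using made-up table and procedure names (nothing here comes from a real system):

CREATE PROCEDURE dbo.GetOrdersForCustomer
    @CustomerId int
AS
BEGIN
    -- Copy the parameter into a local variable so the plan is not
    -- optimized for whichever value happened to be sniffed first
    DECLARE @LocalCustomerId int
    SET @LocalCustomerId = @CustomerId

    SELECT OrderId, OrderDate
    FROM dbo.Orders
    WHERE CustomerId = @LocalCustomerId
END
GO

-- Option 2 for comparison: force a fresh plan on each call
EXEC dbo.GetOrdersForCustomer @CustomerId = 42 WITH RECOMPILE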

In the case that I had, I reduced a heavy stored procedure from over 4 minutes to about 30 seconds. However, in my case only items 3 and 4 made any discernible difference.

It's a wonder this doesn't happen more often - it seems to be a strange catch out that only appears in certain circumstances.

For a detailed example of this look on Omnibuzz's blog

Thursday 21 February 2008

Deploying Office Tools SE with VS 2005 / Creating Custom Bootstrap

I'm in the process of working out how to include the Office 2007 PIA/VSTO redistributable packages with my application (an Office 2007 Ribbon app). Unfortunately these packages do not appear in the Prerequisites list in VS 2005.

As part of my investigations I stumbled on a nice Custom Bootstrap Manifest Generator written by David Guyer. It looks to be very useful - I've yet to use it, but it's worth bookmarking.

Find it here : Bootstrapper Manifest Generator

Also, I have found a nice article on how to deploy Office 2003/2007 solutions using Windows Installer. It's a 2 part walkthrough.

Of major interest is the Package.xml file for PIA (2003/2007).

I've run through this and it all works - though not exactly how I wanted it to. You have to execute the setup.exe in order to install .NET 2.0, VSTO and the PIAs. It is interactive, requiring some clicks on Next buttons. I really wanted it to be silent, but I guess as this only needs to be run the once then it is OK. The MSI can be executed silently thereafter to install updates, so it's good enough.
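
For reference, the subsequent silent MSI installs are just the standard msiexec quiet switch (the MSI name is a made-up placeholder):

msiexec /i MyRibbonApp.msi /qn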

Wednesday 20 February 2008

Profiler and Perfmon

Found this article about how to combine Profiler and Perfmon - very useful for analysing performance of particular SQL activities.

Integrating Profiler and Perfmon Log Files

SQL Server 2005 : Procedure Cache memory usage keeps growing

We have had an issue with a clustered instance of SQL Server 2005 whereby it would run out of memory. Several things were tried but didn't seem to make any difference.

The server in question was RTM and, due to changes in infrastructure, could not be upgraded to SP2. Without SP2 we were not able to properly ascertain the cause of the issue.

A new installation was created and updated to SP2 (3050) and the majority of the applications were moved.

We have been monitoring the memory and so far SP2 seems to have sorted the issue. Further investigation has highlighted a problem with SQL 2005 RTM whereby the procedure cache is not properly flushed, and our findings seem to agree with this. The monitoring needs to run over a few days, but so far it is looking OK.

The SQL used to monitor the cache is as follows; we also used Idera to monitor it visually:


-- Get the size of the Plan Cache (CACHESTORE_SQLCP is non-SP and CACHESTORE_OBJCP is SP)
SELECT (SUM(single_pages_kb) + SUM(multi_pages_kb) ) / (1024.0 * 1024.0) AS 'Plan Cache Size(GB)'
FROM sys.dm_os_memory_cache_counters
WHERE type = 'CACHESTORE_SQLCP'
OR type = 'CACHESTORE_OBJCP'

-- Top 10 consumers of memory from Buffer Pool
SELECT TOP 10 type, sum(single_pages_kb) AS [SPA Mem, Kb]
FROM sys.dm_os_memory_clerks
GROUP BY type
ORDER BY SUM(single_pages_kb) DESC

-- UseCounts and # of plans for Adhoc plans
SELECT usecounts, count(*) as no_of_plans
FROM sys.dm_Exec_Cached_plans
WHERE cacheobjtype = 'Compiled Plan'
AND objtype = 'Adhoc'
GROUP BY usecounts
ORDER BY usecounts

-- Find the ad-hoc queries that are bloating the plan cache
SELECT TOP(100) *
FROM sys.dm_Exec_cached_plans
CROSS APPLY sys.dm_exec_sql_text(plan_handle)
WHERE cacheobjtype = 'Compiled Plan'
AND objtype = 'Adhoc' AND usecounts = 1
--AND size_in_bytes < 200000
ORDER BY size_in_bytes DESC
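
As a stopgap on a box that cannot be patched, the plan cache can also be flushed manually. This is a blunt instrument - every plan on the server has to be recompiled afterwards - so use with care:

-- Clears the entire plan cache; expect a burst of recompilations
DBCC FREEPROCCACHE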

Tuesday 19 February 2008

TFS : The permissions granted to user are insufficient for performing this operation. (rsAccessDenied)

I got the error message:
"The permissions granted to user are insufficient for performing this operation. (rsAccessDenied)" for a user today when running the reports from Team Foundation Server.

This was an existing user who had not experienced this particular error before. I had just created a new project, and had assigned the AD user group (using the excellent Power Tool - Team Foundation Server Administration Tool) as a contributor/publisher. I even created them as a standalone user and still no joy.

After some web crawling I found that there is an interesting anomaly in Reporting Services. The role "Publisher" by default does not have the ability to view reports. OK....

To resolve the issue :

1. Open SSMS and connect to the Reporting Services instance.
2. Expand Security, then Roles.
3. Open the Properties window for the Publisher role and check:
View Reports
View Resources
View Folders
View Models
4. Click OK.

For more information see :
Lesson 1: Setting System-level Permissions on a Report Server
and
Lesson 2: Setting Item-level Permissions on a Report Server

In the beginning...

I blame Owen for this. I've always kept notes but never centralised them, so this blog is an attempt at recording snippets of (hopefully) useful information that I have stumbled on as part of my working life. Some of it may be useful to others, but mostly it will be useful for me to refer to, as my ability to retain information long term ain't what it used to be.



Enjoy :)