MSDN provides an example of INSERTing large data into SQL Server using the WCF-SQL adapter’s built-in FILESTREAM capabilities. However, it’s also possible to leverage the transaction enlisted by the WCF adapter in a custom pipeline to pull FILESTREAM data out of SQL Server more efficiently than the more common SELECT … FOR XML query, which simply grabs the FILESTREAM content and stuffs it into an XML node in the resulting document.
Imagine, for example, you had a large document of some sort (XML, Flat File, etc.) to store in SQL Server that BizTalk would need to process from a FILESTREAM table defined like so:
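The original table definition isn’t reproduced here, but a minimal sketch of such a FILESTREAM table might look like the following (table and column names are hypothetical; it assumes FILESTREAM is enabled on the instance and a FILESTREAM filegroup has been added to the database):

```sql
-- Hypothetical FILESTREAM table; names and columns are illustrative only.
-- A FILESTREAM table requires a ROWGUIDCOL column with a UNIQUE constraint
-- and stores the large content in a VARBINARY(MAX) FILESTREAM column.
CREATE TABLE dbo.LargeDocuments
(
    DocumentId  UNIQUEIDENTIFIER ROWGUIDCOL NOT NULL UNIQUE
                DEFAULT NEWSEQUENTIALID(),
    FileName    NVARCHAR(260)    NOT NULL,
    ReceivedOn  DATETIME2        NOT NULL DEFAULT SYSUTCDATETIME(),
    Content     VARBINARY(MAX)   FILESTREAM NULL
);
```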
The tradeoff here is losing the typed XML data in favor of more efficient storage and access to larger file objects (especially when the data will, on average, be large). This can make a vast difference if you have…
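For contrast, the “more common” approach mentioned above pulls the entire FILESTREAM payload through T-SQL and embeds it (base64-encoded) in the XML result. A rough sketch, using the same hypothetical names:

```sql
-- Conventional SELECT ... FOR XML approach (hypothetical names):
-- the full FILESTREAM content is read server-side and stuffed into
-- an XML node, which is expensive for large documents.
DECLARE @DocumentId UNIQUEIDENTIFIER = 'A0000000-0000-0000-0000-000000000001';

SELECT  DocumentId,
        FileName,
        Content
FROM    dbo.LargeDocuments
WHERE   DocumentId = @DocumentId
FOR XML PATH('Document'), TYPE;
```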
I wrote previously (here) about using SQL Server Profiler to capture and debug SQL calls from BizTalk. That method works well when the procedure call itself succeeds (but you want to tune the procedure using ‘real’ data from BizTalk). It’s less helpful when BizTalk can’t call the procedure at all – because of an invalid conversion, a malformed table type, or some other error that prevents the procedure from actually being run. Either no event gets logged in Profiler, or you see only the stored procedure name but no parameters (because BizTalk realizes the procedure won’t be able to execute correctly).
So you design your strongly typed stored procedure to take table types from BizTalk, and it runs great through unit testing with your test cases. But then you start running larger jobs and suddenly SQL is choking on it.
Ever wish you could just run that SQL call directly in SSMS with those exact several thousand rows for the table type parameters, and step through it using the debugger? Well, you can using SQL Server Profiler (and/or Server Traces). I used this technique recently to help a client resolve a particularly thorny issue that came up when they tried to process some larger messages.
To walk through the process of doing this, I’ll use a database named BTSTrainDb with a stored procedure (dbo.usp_DemoTableTypeSP) that takes a user-defined Table Type (dbo.DemoTableType) as a parameter and then just selects * from…
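When Profiler captures such a call, the trace text can be replayed directly in SSMS. The procedure and type names below come from the post itself; the parameter name, columns, and row values are hypothetical, but the shape of the replayed script would be roughly:

```sql
-- Sketch of a captured table-type call replayed in SSMS
-- (dbo.DemoTableType / dbo.usp_DemoTableTypeSP are from the post;
--  the columns, rows, and parameter name are illustrative).
DECLARE @rows dbo.DemoTableType;

INSERT INTO @rows (Id, Name)
VALUES (1, N'First row'),
       (2, N'Second row');
-- A real capture contains one row per record, so a several-thousand-row
-- message yields a script you can step through with the exact data.

EXEC dbo.usp_DemoTableTypeSP @DemoTable = @rows;
```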
Recently, while troubleshooting performance issues at a client, I came across a peculiar issue. The environment in question was a multi-server environment with multiple BizTalk Servers (2010) connected to a SQL Server cluster. The issue was extremely slow performance when trying to view, refresh, start, or stop the host instances through the Administration Console. Usually it takes a few seconds for host instances to start, stop, or restart; in this case, however, just refreshing the status of the host instances was taking over a minute.
The problem was that one of the BizTalk Servers in the group had been offline for some time, causing the host instances tied to that server to become unresponsive. This slowed down viewing the host instances for any of the other servers in the BizTalk Group.
There are two ways to resolve this:
1. Turn on…
We have a great love for BizTalk 360 here at Tallan, and for the many capabilities it brings to BizTalk monitoring, alerting, and governance. For those not familiar, BizTalk 360 is a web-based portal built in Microsoft Silverlight that monitors BizTalk environments and is designed to address the common hurdles enterprises face when managing them.
When sending a message via the SMTP adapter in BizTalk, we have the option of attaching messages as parts of a multi-part message. If the attached message is too large, the send will fail. In our application, we send email notifications of failed messages in BizTalk with the original message included as an attachment.
One solution we devised was to compress the original message and attach it to the email. This solution, however, presented its own considerations.
We didn’t want to write the compressed file to a temporary directory on a server; we wanted to do all of the compression in memory.
Since we were compressing in memory we had to come up with a solution to attach the file to the email message.
The compression in memory is discussed in my posting on File Compression In Memory. …
The PAL (Performance Analysis of Logs) tool is a powerful tool that reads in a Performance Monitor counter log and analyzes it using known thresholds. A template created with the PAL tool is imported into the server’s Performance Monitor as a new counter set, covering the relevant BizTalk and SQL metrics to be tracked. The resulting .blg files are then analyzed by the PAL tool, which applies thresholds once questions about the number of CPUs, memory, etc. are answered. PAL produces a report in an easy-to-read format with graphs. Tallan will use this information to make recommendations for performance improvements.
Threshold files take the questions answered in the tool and apply them to the results. This is a quick and easy way to see where the weaknesses are without lengthy analysis.
An easy to use GUI…