Tallan's Technology Blog

Tallan's Top Technologists Share Their Thoughts on Today's Technology Challenges

SSRS Error After Cumulative Updates Applied

Brian Zebarth

After installing Windows updates, SharePoint 2013 Service Pack 1, and the March 2016 Cumulative Update, some users reported a problem with SSRS reports. They were receiving the following error message:
“Microsoft.ReportingServices.Diagnostics.Utilities.ItemNotFoundException: The item ‘http://sharepoint/sites/SiteName/LibraryName/FileName.rdl’ cannot be found.”
Sometimes the error occurred as soon as they clicked the link to the RDL file; other times the report viewer would come up and the error appeared when they attempted to run the report.
I reviewed the ULS Logs and found the following errors:
“Cannot find site lookup info for request Uri http://webservername:12345/{guid}/ReportExecution.svc.”
“Microsoft.ReportingServices.Diagnostics.Utilities.ItemNotFoundException: The item ‘http://sharepoint/sites/SiteName/LibraryName/FileName.rdl’ cannot be found.”
I researched the errors and found a few posts from people who had similar issues and had success deleting and re-adding the Reporting Services Service Application. I didn’t want to do that if I didn’t have to, so I thought I would try…

Deploying WCF Services with BTDF

Mike Agnew

There are many situations where we need to deploy WCF Services to interact with BizTalk, but if you’re constantly deploying to new systems, the process of manually creating and configuring them can become time-consuming and tedious. By leveraging the BizTalk Deployment Framework (BTDF), we can automate this process as part of your typical BTDF deployment.
In this guide we’ll step through how to configure BTDF to build the deployment of our existing WCF Services into the installer that BTDF creates.
To follow this guide, it’s assumed that you have the following prerequisites:

A BizTalk solution and environment with BTDF installed
WCF Services already deployed on the same machine, which you wish to replicate via deployment
A basic knowledge of BizTalk, WCF, and IIS

First up we need to move our services into our solution folder. Simply…
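Once the service files are in the solution folder, BTDF can create the IIS virtual directory for them during deployment. Working from memory of BTDF’s documented VDirList item group (the element and metadata names below should be verified against your BTDF version’s documentation, and the service and app pool names are hypothetical), the .btdfproj entries look roughly like:

```xml
<!-- Hypothetical sketch of BTDF virtual-directory deployment;
     verify element/metadata names against your BTDF documentation. -->
<PropertyGroup>
  <IncludeVirtualDirectories>true</IncludeVirtualDirectories>
</PropertyGroup>
<ItemGroup>
  <VDirList Include="*">
    <Vdir>MyWcfService</Vdir>
    <Physdir>..\MyWcfService</Physdir>
    <AppPool>MyAppPool</AppPool>
  </VDirList>
</ItemGroup>
```

With something like this in place, the generated installer creates and removes the virtual directory alongside the rest of the BizTalk deployment.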


EF6 vs. EFCore: Source Code Beauty

Aesa Kamar

I was doing some comparative analysis on some of the libraries in Apache Spark when I stumbled upon this project, CodeFlower by fzaninotto. It represents source code visually by graphing each source file, linking files by their directory and dependency structure, sizing them by count of lines of code, and coloring them accordingly. The images are sized proportionately.
So I took a look at the two versions of Entity Framework that the .NET team at Microsoft has been supporting and visualized them.
They probably make nice wall posters.

Entity Framework Core
Entity Framework 6

Part 1 – Analysis Services Multidimensional: “A duplicate attribute key has been found when processing”…

Mark Frawley

Introduction – Part 1
The most common and dreaded error when processing a dimension in Analysis Services Multidimensional (MD) is undoubtedly “Errors in the OLAP storage engine: A duplicate attribute key has been found when processing: Table: ….” ‘Common’ because of both the poor data integrity frequently found in cube data sources and improper modeling of the data in MD. ‘Dreaded’ because a number of distinct situations all give rise to this error, and it can be difficult to diagnose which one is the actual problem. There is enough to explore around this error that this is Part 1; Part 2 will follow.
The full error message looks like “Errors in the OLAP storage engine: A duplicate attribute key has been found when processing: Table: ‘TABLENAME’, Column: ‘COLUMN’, Value: ‘VALUE’. The attribute is ‘ATTRIBUTE’.”  TABLENAME will…
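In practice, the usual root cause can be spotted in the source data itself: a key column whose values map to more than one name value. As a quick diagnostic sketch (in Python, with hypothetical column names; the same check is commonly done with a GROUP BY/HAVING query against the dimension table):

```python
# Find attribute key values that map to more than one name value,
# a classic cause of the "duplicate attribute key" error.
# Column names (CityKey, CityName) are hypothetical placeholders.
from collections import defaultdict

def find_duplicate_keys(rows, key_col, name_col):
    """Return {key: set_of_names} for keys with conflicting names."""
    names_by_key = defaultdict(set)
    for row in rows:
        names_by_key[row[key_col]].add(row[name_col])
    return {k: v for k, v in names_by_key.items() if len(v) > 1}

rows = [
    {"CityKey": 1, "CityName": "Springfield"},
    {"CityKey": 1, "CityName": "springfield"},  # differs only by case
    {"CityKey": 2, "CityName": "Shelbyville"},
]
print(find_duplicate_keys(rows, "CityKey", "CityName"))
```

Note that values differing only in case or trailing whitespace are a frequent offender, since a case-insensitive database collation can treat them as one value while the processing engine does not.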

Introduction to adding Datasets using Power BI API

Iu-Wei Sze

I recently worked on a POC using the Power BI API. The purpose of the POC was to add a dataset from a SQL database into my Power BI workspace through a console application instead of doing it manually from the GUI in Power BI. The goal was to make it abstract enough that we could use it for any SQL database by simply adjusting a few parameters in an app.config file.
To see an example of how one would use the Power BI API in Visual Studio, I used this GitHub repository as a base for my application. There are some things the example doesn’t show, which is what I’ll focus on in this article.
Since the purpose of this article is to go over lesser known features…
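The referenced example is a C# console app; as a language-neutral sketch of the underlying REST call, the request below targets the Power BI push-dataset endpoint (POST https://api.powerbi.com/v1.0/myorg/datasets, per the Power BI REST API; verify against current docs). The table and column definitions are hypothetical, and acquiring the Azure AD access token is out of scope here and stubbed as a placeholder string:

```python
# Sketch of creating a push dataset via the Power BI REST API.
# Table/column names are hypothetical; token acquisition is stubbed.
import json
import urllib.request

def build_dataset_request(name, tables, token):
    """Build (but do not send) the HTTP request that creates a dataset."""
    body = {"name": name, "tables": tables}
    return urllib.request.Request(
        "https://api.powerbi.com/v1.0/myorg/datasets",
        data=json.dumps(body).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": "Bearer " + token,
        },
        method="POST",
    )

tables = [{
    "name": "Sales",
    "columns": [
        {"name": "OrderId", "dataType": "Int64"},
        {"name": "Amount", "dataType": "Double"},
    ],
}]
req = build_dataset_request("SalesDataset", tables, "<ACCESS_TOKEN>")
print(req.method, req.full_url)
```

Sending it is then just `urllib.request.urlopen(req)`; in a POC like the one described, the table and column list would come from the app.config equivalent rather than being hard-coded.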

BizTalk 2016: Importing Tracking Settings

Chetan Vangala

Microsoft has provided us with a lot of nifty new features in BizTalk 2016, and among them are a few different ways to control application settings while importing and exporting applications. We’ve discussed some of these new additions in a previous post, but today we’ll talk about a new checkbox that controls whether or not the tracking settings will be imported.

Working with Web Services Using the Boomi API

Mohammed Malick

In this blog we are going to discuss how we can use the Boomi API to overcome some of the limitations of the Web Services Server connector.
Here are some of the limitations we discussed in our previous blog:

In a REST service, how do we restrict the process to specific REST verbs, for example accepting only GET/POST and rejecting DELETE/PUT?
Passing queries/parameters in the URL itself
Something like this: http://SERVER:9090/ws/simple/getBlogs/blogid/1 or this: http://SERVER:9090/ws/simple/getBlogs?blogid=1
Managing or assigning multiple web service endpoints to a Boomi process
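To pin down what the first limitation asks for, here is a conceptual sketch (plain Python, not Boomi code) of verb restriction: allowed verbs are handled, everything else is rejected with 405 Method Not Allowed.

```python
# Conceptual sketch of restricting a service to specific HTTP verbs.
# Not Boomi code; it only illustrates the desired behavior.
ALLOWED_VERBS = {"GET", "POST"}

def dispatch(verb):
    """Return an HTTP status code for the incoming request verb."""
    if verb.upper() not in ALLOWED_VERBS:
        return 405  # Method Not Allowed
    return 200     # handled normally

print(dispatch("GET"))     # 200
print(dispatch("DELETE"))  # 405
```

In Boomi the same effect is achieved through platform configuration rather than code; the sketch only defines the behavior we want the process to exhibit.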

This blog post will discuss ways to work with the Boomi AtomSphere API to work around these limitations.
Before we begin using the Boomi API, we need to verify access to it.
Log in to the Boomi portal and, in the top-right menu, select API Management.

If you are able to see the screen below, then you have access to…


Adding R Packages In Azure ML

Iu-Wei Sze

The Execute R Script module in Azure Machine Learning is incredibly useful for manipulating data in ways that other modules do not cover. Its functionality can be further expanded by adding R packages that are not included in Azure ML by default. We will first show you how to get a list of packages that are already in your workspace and then how to add additional packages.
Checking Which R Packages are in Your Workspace
Create a new experiment, and place the following R code in an “Execute R Script” module:
data.set <- data.frame(installed.packages())
maml.mapOutputPort("data.set")  # expose the data frame on the module's output port

Run the experiment. The output of the Execute R Script module will be a list of the available packages.
Adding R Packages
Before you can use a package in Azure ML, you need to set up the zip file structure in which Azure ML expects packages to appear. To do this, start by installing the…

Test Harnesses and the Use of Exception Shapes

While testing an integration, it is always important to perform a series of negative test cases to ensure the process can fail gracefully through error handling. It is not always possible for endpoint environments to be set up for quick and easy testing; for this reason, a useful step in the development process is to create test harnesses. These can act as implementations of abstract test cases beyond simple shape-to-shape testing. Test harnesses are also incredibly useful for regression testing.

Applying new NIST standard to Asp.Net Pt. 1 (PBKDF2, SHA256, Password content)

Jeremy Mill

Most developers know that you should never store passwords in plain text and that they should be hashed. Only slightly fewer know that a “salt” should be appended to the password before hashing to prevent time-memory trade-off attacks (1). Fewer know which hash function they should use, and it seems lately that the majority don’t know that salting and hashing alone isn’t enough: they should instead be using a key derivation function such as PBKDF2 or scrypt. We will be exploring PBKDF2, but scrypt is a perfectly viable option. The current draft of the new NIST guidelines says (2):
Verifiers SHALL store memorized secrets in a form that is resistant to offline attacks. Secrets SHALL be hashed with a salt value using an approved hash function such as PBKDF2 as described in [SP800-132]. The salt…
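As an illustration of the requirement above (sketched in Python’s standard library rather than ASP.NET, where the analogous primitive is Rfc2898DeriveBytes): a per-user random salt, PBKDF2 with SHA-256, and a constant-time comparison on verify. The iteration count below is an arbitrary example, not a recommendation; tune it to your hardware and current guidance.

```python
# Illustrative salted PBKDF2 password storage and verification.
# ITERATIONS is an example value only, not a recommendation.
import hashlib
import hmac
import os

ITERATIONS = 100_000

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Return (salt, digest) for storage; the salt is per-user random."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    """Recompute the digest and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, digest))  # True
print(verify_password("wrong guess", salt, digest))                   # False
```

The `hmac.compare_digest` call matters: a naive `==` on digests can leak timing information to an attacker probing the verifier.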