I ran into an infuriating issue today while trying to re-enable WSManCredSSP, which I had disabled to get screenshots for my last blog post on setting up PSRemoting. (You're welcome!)
When attempting to execute Enable-WSManCredSSP I got the following error:
Enable-WSManCredSSP : This command cannot be executed because the setting cannot be enabled. This can happen if no network connection is present.
At line:1 char:20
+ Enable-WSManCredSSP <<<< -Role Client -DelegateComputer "leg-dev-shpt-01.legis.local"
+ CategoryInfo : InvalidOperation: (System.String:String) [Enable-WSManCredSSP], InvalidOperationException
+ FullyQualifiedErrorId : WsManError,Microsoft.WSMan.Management.EnableWSManCredSSPCommand
I double-checked my security policy and ensured that PSRemoting was enabled on the server, but I was still unable to get the command to execute. I even bounced the server with no luck, and running the PowerShell session in both administrative and non-admin modes had no effect.
I came across a Windows Server forum thread on Enabling and Using CredSSP that had the answer. You…
This post describes setting up PSRemoting to allow execution of PowerShell commands on your SharePoint 2010 Server instance from your FAST Search server instance. These instructions should, however, work between any two machines on the same domain.
Setting Up PSRemoting
First, PSRemoting needs to be enabled on both the FAST and SharePoint servers. Launch a PowerShell session as Administrator on each machine, and run the following command:
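(The command itself appears to have been lost from this excerpt; the standard cmdlet for this step, which produces the approval prompts described next, is sketched below.)

```powershell
# Enables the WinRM service, creates a listener, and adds the firewall exceptions PSRemoting needs
Enable-PSRemoting
```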
You will then be met with various prompts, asking you to approve execution of PowerShell commands to enable remoting. Answer Y for each and hit Enter.
Enabling CredSSP Authentication on the Client (FAST)
The authentication type we will be using is called CredSSP. This needs to be enabled as well on both the client and server machines.
On the FAST server (which will be our client), execute the following command:
Enable-WSManCredSSP -Role Client -DelegateComputer sharepointcomputer.mydomain.local
You will see the following…
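The excerpt above is truncated; on the server side, the counterpart command (a standard WSMan cmdlet, not quoted from the post) would typically be:

```powershell
# On the SharePoint server (the CredSSP "server" role), from an elevated session
Enable-WSManCredSSP -Role Server
```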
While attempting to execute a FAST Server back-up following the directions on TechNet, we ran into the following error:
Cannot find type [Microsoft.SqlServer.Management.Smo.Server]: make sure the assembly containing this type is loaded.
This was an issue we hadn't experienced in production, and we were unsure how to resolve it.
After searching around for a while, I found the following post on blogs.like10.com that describes the solution.
Apparently, FAST requires SQL Server Management Objects, CLR types, and additional PowerShell components to successfully run the FAST Backup.ps1 file.
From the blog:
In our case we are using SQL Server 2008 R2 so the files we require are part of the Microsoft SQL Server 2008 R2 Feature Pack and we need to download the following components (install in order listed):
Microsoft System CLR Types for SQL Server 2008 R2
Microsoft SQL Server 2008 R2 Shared Management Objects
Microsoft Windows PowerShell Extensions for SQL Server 2008…
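Once those components are installed, a quick sanity check (a sketch, not from the post; LoadWithPartialName is era-appropriate for PowerShell 2.0) is to confirm the SMO type now resolves:

```powershell
# Load the SMO assembly and instantiate the type the backup script needs
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.Smo") | Out-Null
New-Object Microsoft.SqlServer.Management.Smo.Server  # should no longer throw "Cannot find type"
```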
Ever have to deal with Crawled Properties in PowerShell, or look through FAST Search FFDumper logs and try to figure out what the type numbers mean? I have found a couple of random blog posts that cover some, but not all of them, and I’ve tried to find a definitive list.
I finally came across a TechNet page for the New-FASTSearchMetadataCrawledProperty FAST PowerShell command, which lists this page as the source of the Variant Type codes:
From that page (for posterity’s sake):
VT_EMPTY (0): A property with a type indicator of VT_EMPTY has no data associated with it; that is, the size of the value is zero.
VT_NULL (1): This is like a pointer to NULL.
VT_I1 (16): 1-byte signed integer.
VT_UI1 (17): 1-byte unsigned integer.
VT_I2 (2): Two bytes representing a 2-byte signed integer value.
VT_UI2 (18): 2-byte unsigned integer.
VT_I4 (3): 4-byte signed integer value.
VT_UI4 (19): 4-byte unsigned integer.
VT_INT (22): 4-byte signed integer value (equivalent to VT_I4).
VT_UINT (23): 4-byte unsigned integer (equivalent to VT_UI4).
FAST Search Server 2010 comes with a web application for executing FQL queries against the FAST index, called QRServer. One of our clients was having an issue executing FQL queries: they were only getting back data crawled by the FAST Web Crawler, and none of the SharePoint data crawled by the FAST Content SSA.
The issue stemmed from permissions on the FAST server. Our client did not wish to add all of the developers' accounts to FASTSearchAdministrators, and he found a solution/workaround on the TechNet forums:
To disable the Security FQL stage in the query pipeline, follow the instructions below. For those of you who are not familiar with this stage, the securityfql stage checks the security rights of the logged-in user against the docacls in each search result and filters the results down to those the current user has rights to see. Also,…
In a recent client solution we were utilizing a multi-valued Managed Metadata field to classify documents in an organizational taxonomy. Our search and navigation relied on being able to retrieve documents by supplying any one of the possibly multiple values stored in the Managed Metadata field.
We determined, however, that even though many documents had multiple values in this field, only one value was being returned in the managed property.
We examined the crawled property feeding the managed property and found IsMultiValued = false, and figured that was the issue. It turns out, this property is not used in FAST.
From Multi-Value Property Support in FAST Search Server for SharePoint (en-us):
Configuring multi-value properties
All crawled properties support multi-value data. Note that the crawled property configuration in the index schema contains a property named IsMultiValued. This is not used.
When you map to a managed property …
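The quote is cut off above. As a hedged sketch of how multi-value output is typically enabled in FS4SP (the cmdlet is a real FS4SP cmdlet; the managed property name here is hypothetical):

```powershell
# Hedged sketch: make the managed property return every mapped value
# ("MyTaxonomyProp" is a hypothetical managed property name)
$mp = Get-FASTSearchMetadataManagedProperty -Name "MyTaxonomyProp"
$mp.MergeCrawledProperties = $true   # merge all crawled property values instead of keeping just one
$mp.Update()
```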
Crawls of our FAST Search content all of a sudden started "hanging", reporting 0 successful crawls along with some errors.
We were seeing no errors to speak of in the event logs or in the ULS logs on either SharePoint server, and the FAST Server was reporting no errors.
Expired Self Signed Certificate for Content SSA Communication
Like many people, we followed the setup instructions on TechNet for setting up the FAST Search Server 2010 for SharePoint 2010 Farm found here:
During that setup process, we used the supplied FASTSearchCert.pfx self-signed certificate to secure the Content SSA communication. That certificate has a one-year expiration date. Exactly one year after configuration of the servers, crawl ceased to function.
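A quick way to see whether this has bitten you (a sketch; it assumes the certificate sits in the local machine store with "FASTSearch" in its subject):

```powershell
# List FAST-related certificates and their expiration dates
Get-ChildItem Cert:\LocalMachine\My |
    Where-Object { $_.Subject -like "*FASTSearch*" } |
    Select-Object Subject, NotAfter
```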
One solution is to merely generate a new self-signed certificate via Microsoft's instructions here:
This will, however, land you in the same situation a year from…
When we were attempting to crawl an RSS feed for a client using the FAST Web Crawler, we found the following error in our RSS Crawl Logs (located at \FASTSearch\var\log\crawler\node\fetch\<content collection name>)
“Invalid RSS MIME type: text/xml”
We found that the MIME type being indicated by the server was indeed "text/xml". However, we were able to crawl another feed that also used "text/xml".
See the screenshot below: the feed on the left crawled properly, while the feed on the right did not.
The only difference we were able to find was that the working feed had the XML version tag, while the non-working one did not.
In order to ensure FAST can properly crawl your RSS feeds, make sure they are being returned as MIME type: application/rss+xml.
We were able to have our client change the MIME type of the RSS feed, and although it was formatted exactly it was…
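To check what a server is actually sending, one hedged approach (the feed URL is a placeholder):

```powershell
# Inspect the Content-Type header returned for a feed
$response = [System.Net.WebRequest]::Create("http://example.com/rss").GetResponse()
$response.ContentType   # FAST expects "application/rss+xml" here
$response.Close()
```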
While attempting to set up the FAST Web Crawler for a client, I was unable to find a way to force FAST to re-crawl data immediately. The web crawler uses an internal scheduler to schedule fetch operations.
There are a large number of options for the crawleradmin.exe command line tool that looked like they might work:
(from the crawleradmin.exe reference on TechNet)
but none of them forced an immediate crawl.
I found that using
crawleradmin -d <content collection name>
to delete the data in the FAST Web Crawler content collection, then using
crawleradmin -f <crawler config .xml file>
forced an immediate recrawl.
Note: Deleting the data from the FAST Web Crawler content collection DOES NOT delete all of the data from the FAST content collection with the same name. Any data crawled through your FAST Content SSA will still be intact after running crawleradmin –d.
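Putting the two commands together (the collection name and config path below are examples, not from the post):

```powershell
# Force an immediate re-crawl of a web crawler collection
crawleradmin -d mycollection               # clears only the web crawler's copy of the data
crawleradmin -f .\mycollection-config.xml  # re-adds the config, triggering an immediate fetch
```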