A wide range of scheduling options is available to a Boomi application once it is deployed. Depending on the inbound connector, a deployed process can run on a schedule or continuously poll an existing data source. Which behavior applies depends on whether the inbound connector uses a “Get” operation or a “Listen” operation.
A “Get” operation retrieves a set of inbound records from a data source. Once deployed, such a process can either be triggered manually or placed on a schedule so that it runs at set intervals, for example every Tuesday and Thursday.
A “Listen” operation continuously polls a data source and retrieves data as it arrives. An…
While the Dell Boomi user interface is robust and user friendly, there are times when using it proves quite tedious. One example we have run into is process deployments. To deploy a set of Boomi processes, one must search for each individual process and deploy it separately. This may not seem overly tedious if AtomSphere holds only a few processes, but in some cases AtomSphere holds numerous applications, each consisting of many processes. Given that scenario, it is easy to see how frequent deployments could become very time consuming.
Luckily, there is an alternative to the AtomSphere user interface. Dell Boomi offers a set of APIs (both REST and SOAP) that can be used for a variety of tasks, including deploying Boomi processes….
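As a rough sketch of what scripted deployments can look like, here is a minimal Python example that builds a request for the AtomSphere REST API. The endpoint path and field names (`environmentId`, `processId`) follow the legacy Deployment object and should be verified against your account's API reference; the account ID and credentials are placeholders.

```python
import base64
import json

# Base URL for the AtomSphere REST API (verify against current Boomi docs).
API_BASE = "https://api.boomi.com/api/rest/v1"

def build_deployment_request(account_id, environment_id, process_id, user, password):
    """Return the URL, headers, and JSON body for deploying one process.

    Field names follow the legacy Deployment object and are assumptions
    here -- check them against the AtomSphere API reference.
    """
    url = f"{API_BASE}/{account_id}/Deployment"
    credentials = base64.b64encode(f"{user}:{password}".encode()).decode()
    headers = {
        "Authorization": f"Basic {credentials}",
        "Content-Type": "application/json",
        "Accept": "application/json",
    }
    body = json.dumps({
        "environmentId": environment_id,
        "processId": process_id,
        "notes": "Deployed via the AtomSphere API",
    })
    return url, headers, body

# Looping over many process IDs replaces the per-process UI search:
# for pid in process_ids:
#     url, headers, body = build_deployment_request(ACCOUNT, ENV, pid, USER, PASS)
#     requests.post(url, headers=headers, data=body)  # needs the requests package
```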
On one of our current projects, after completing initial development we reached the point where we needed to test how our Boomi application handled very large EDI files. Our first test used an 834 EDI file with many enrollments, and during this testing our maps produced unexpected output we had not seen before. It was hard to gauge from within the application exactly what was going wrong in the mapping, but the fact that the process had worked with smaller files led us to suspect an error in our profiles that the larger file was exposing.
On one of our Dell Boomi projects we had a simple task: create an XML profile by importing an XSD file generated with BizTalk’s schema generator. While this seemed trivial at the time, importing the file produced the following error: “Error Occurred: Error on line 1: Content is not allowed in prolog.”
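A common cause of this error (not necessarily the only one) is stray bytes before the `<?xml` declaration, most often a UTF-8 byte order mark (BOM) written by the tool that generated the file. A quick way to check for and remove one, as a minimal Python sketch (the file name is a placeholder):

```python
import codecs

def strip_utf8_bom(data: bytes) -> bytes:
    """Remove a leading UTF-8 byte order mark, if present."""
    if data.startswith(codecs.BOM_UTF8):
        return data[len(codecs.BOM_UTF8):]
    return data

# Example: repair an XSD file in place (file name is a placeholder).
# with open("schema.xsd", "rb") as f:
#     raw = f.read()
# with open("schema.xsd", "wb") as f:
#     f.write(strip_utf8_bom(raw))
```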
On a recent project, I was given the requirement to connect MuleSoft to Microsoft’s Azure Service Bus Queue. Logically, the first thing I did was look at the connectors MuleSoft offers on its Exchange site. A quick search showed that MuleSoft does indeed have a connector for exactly this. The problem was that the connector requires the enterprise edition of MuleSoft, which of course is a paid service. For my particular task I had to use Mule’s community edition, so that connector could not help me. Fortunately, there is an easy way to connect to the Azure Service Bus Queue using just one HTTP connector. This blog post will show you how I set that connection up.
As most people reading this blog post have probably already discovered, the Azure Service Bus Queue is a very powerful tool that can be used in a variety of scenarios. If you have a .NET application communicating with the Service Bus Queue, chances are you are using this NuGet package to do so. That raises the question: “What if I am not using .NET but still need to communicate with the Service Bus Queue?”
The good news is that in addition to the .NET client, there is also a REST interface for communicating with the queue. The bad news is that the request header must include a Shared Access Signature (SAS) authorization token, and Azure does not provide any out-of-the-box way of retrieving this token. If you attempt to…
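For reference, a SAS token can be generated in a few lines in any language with an HMAC-SHA256 implementation. Here is a Python sketch following the token format Azure documents for Service Bus; the namespace URI, key name, and key are placeholders you would take from the Azure portal.

```python
import base64
import hashlib
import hmac
import time
import urllib.parse

def generate_sas_token(resource_uri, key_name, key, ttl_seconds=3600):
    """Build a Service Bus SAS token.

    The string to sign is the URL-encoded resource URI, a newline, and the
    Unix expiry time, signed with HMAC-SHA256 using the shared access key.
    """
    expiry = int(time.time()) + ttl_seconds
    encoded_uri = urllib.parse.quote_plus(resource_uri)
    string_to_sign = f"{encoded_uri}\n{expiry}"
    signature = base64.b64encode(
        hmac.new(key.encode("utf-8"),
                 string_to_sign.encode("utf-8"),
                 hashlib.sha256).digest()
    ).decode("utf-8")
    return (
        f"SharedAccessSignature sr={encoded_uri}"
        f"&sig={urllib.parse.quote_plus(signature)}"
        f"&se={expiry}&skn={key_name}"
    )

# The token goes in the Authorization header of the REST request, e.g.:
# headers = {"Authorization": generate_sas_token(
#     "https://mynamespace.servicebus.windows.net/myqueue",  # placeholder
#     "RootManageSharedAccessKey",
#     "<shared access key from the Azure portal>")}
```

In the MuleSoft scenario above, this same string can be computed in a small script or expression and set as the `Authorization` header on the HTTP connector.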
On my quest to master MuleSoft RESTful APIs, I encountered a walk-through on MuleSoft’s documentation site that was exactly what I was looking for: a low-level, step-by-step guide to the entire RESTful API lifecycle using MuleSoft and RAML. While reading through the steps, it was apparent early on that the walk-through was written for an earlier version of the Mule runtime than the one I was using. As of the date of this blog post, the walk-through contains instructions for the 3.5.1 runtime while mine was 3.6.1. Overall there didn’t seem to be much of a difference, and any differences I spotted were easily adapted to my version… that is, until I tried to deploy the project to CloudHub. This blog post is meant to help anybody else who is following along with…
While going through the installation steps for setting up MuleSoft on my machine, I generally found the instructions on the MuleSoft site to be very detailed and informative. However, I ran into some difficulties while downloading and setting up the Mule Enterprise Runtime with MMC (Mule Management Console). The instructions MuleSoft provides can be found here. They seemed pretty straightforward, but when I went to the URL given for reaching the MMC, my browser was unable to establish a connection.
So I decided to write this blog post covering the steps I took to download the Mule Enterprise Runtime, the issues I encountered, and how I resolved those issues on a Windows machine.
Here is the list of steps that I took:
On a recent client project I had the task of making one of our existing Dynamics CRM reports use pre-filtering. This is normally a very straightforward procedure, and the steps are documented at the following link: https://msdn.microsoft.com/en-us/library/bb955092.aspx.
Now, given the complexity of the query used to populate this particular report, I needed to use explicit pre-filtering. Again, this should be fairly straightforward. After adjusting the report to use explicit pre-filtering, I deployed it out to CRM. But when I tried running the report, I got the following error: “The expected parameter has not been supplied for the report.”