Custom Pipeline

Microsoft provides many features in BizTalk pipelines through built-in pipeline components, e.g. the JSON Encoder and Decoder, the MIME/SMIME Decoder, etc.

But sometimes we need to add something to the context of a message, and then we need to customize the pipeline. BizTalk provides this facility through custom pipeline components, which are usually written in C# or VB.NET.

Below are a few scenarios where you need to apply a custom pipeline:

  • Conversion from another format to XML and vice versa:

As we know, BizTalk can receive files in XML, text, CSV, and JSON (from BizTalk 2013 onwards). But if you have any other format, e.g. Excel or PDF, you need to convert it to XML, because BizTalk works internally with XML. In a custom pipeline component you can write that conversion code in C# or VB.NET.

  • Add a namespace (this facility also exists out of the box when using ESB pipelines)
  • Promote message context properties based on custom data
  • Set a dynamic send port
  • Message archiving
  • Handling large messages

Requirements to create a custom pipeline:

Dlls:

  • Microsoft.BizTalk.ExplorerOM.dll (C:\Program Files (x86)\Microsoft BizTalk Server 2013 R2\Developer Tools\Microsoft.BizTalk.ExplorerOM.dll)
  • Microsoft.BizTalk.Pipeline.dll (C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE\PublicAssemblies\Microsoft.BizTalk.Pipeline.dll)
  • Microsoft.BizTalk.Streaming.dll (C:\Program Files (x86)\Microsoft BizTalk Server 2013 R2\Microsoft.BizTalk.Streaming.dll)

Pipeline Interfaces:

  • IBaseComponent
  • IComponent
  • IComponentUI
  • IPersistPropertyBag
  • IAssemblerComponent
  • IDisassemblerComponent
  • IProbeMessage

IBaseComponent Interface:

Defines properties that provide basic information about the component. Below are the properties exposed by this interface.

  • Description: Gets the description of the component.
  • Name: Gets the name of the component.
  • Version: Gets the version of the component.

    public interface IBaseComponent
    {
        string Description { get; }
        string Name { get; }
        string Version { get; }
    }
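As a minimal sketch (the class name and property values below are illustrative, not from a real project), an implementation of IBaseComponent could look like this:

    using Microsoft.BizTalk.Component.Interop;

    public class ArchiveComponent : IBaseComponent
    {
        // These values are shown in the Pipeline Designer property grid.
        public string Description
        {
            get { return "Archives a copy of the inbound message."; }
        }

        public string Name
        {
            get { return "ArchiveComponent"; }
        }

        public string Version
        {
            get { return "1.0.0.0"; }
        }
    }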

IComponent Interface (COM)

Defines the methods used by all pipeline components except assemblers and disassemblers.

  • Execute: Executes a pipeline component to process the input message and get the resulting message.

    public interface IComponent
    {
        IBaseMessage Execute(IPipelineContext pContext, IBaseMessage pInMsg);
    }
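For example, the "promote message context property" scenario mentioned earlier fits naturally in Execute. The sketch below assumes a hypothetical property name and namespace; a real solution would use a namespace from a deployed property schema:

    using Microsoft.BizTalk.Component.Interop;
    using Microsoft.BizTalk.Message.Interop;

    public IBaseMessage Execute(IPipelineContext pContext, IBaseMessage pInMsg)
    {
        // Promote a value so it can drive routing (e.g. send port filters).
        // "BatchId" and the namespace are illustrative.
        pInMsg.Context.Promote("BatchId", "https://example.com/properties", "BATCH-001");
        return pInMsg;
    }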

IComponentUI Interface (COM)

Defines methods that enable pipeline components to be used within the Pipeline Designer environment.

  • Icon: Provides the icon that is associated with this component.
  • Validate: Verifies that all of the configuration properties are set correctly.

    public interface IComponentUI
    {
        IntPtr Icon { get; }
        IEnumerator Validate(object projectSystem);
    }

IDisassemblerComponent:

A disassembling component is a pipeline component that receives one message on input and produces zero or more messages on output. Below are two methods used in this component.

  • Disassemble: Performs the disassembling of the incoming document message.
  • GetNext: Gets the next message from the message set that resulted from disassembler execution. Returns NULL if there are no more messages.

    public interface IDisassemblerComponent
    {
        void Disassemble(IPipelineContext pContext, IBaseMessage pInMsg);
        IBaseMessage GetNext(IPipelineContext pContext);
    }
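A common implementation pattern (sketched here with the actual splitting logic omitted) is to queue the produced messages in Disassemble and drain the queue in GetNext:

    using System.Collections.Generic;
    using Microsoft.BizTalk.Component.Interop;
    using Microsoft.BizTalk.Message.Interop;

    public class SplitDisassembler : IDisassemblerComponent
    {
        private readonly Queue<IBaseMessage> outputQueue = new Queue<IBaseMessage>();

        public void Disassemble(IPipelineContext pContext, IBaseMessage pInMsg)
        {
            // In a real component the body stream would be split here and
            // each fragment wrapped in a new IBaseMessage before queueing.
            outputQueue.Enqueue(pInMsg);
        }

        public IBaseMessage GetNext(IPipelineContext pContext)
        {
            // Returning null tells the engine there are no more messages.
            return outputQueue.Count > 0 ? outputQueue.Dequeue() : null;
        }
    }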

IAssemblerComponent

An assembling component is a pipeline component that receives several messages on input and produces one message on output. Assembling components are used to collect individual documents into the message interchange batch.

  • AddDocument: Adds the document message to the list of messages that will be included in the interchange.
  • Assemble: Builds the interchange from the messages that were added by the previous method. Returns a pointer to the assembled message.

    public interface IAssemblerComponent
    {
        void AddDocument(IPipelineContext pContext, IBaseMessage pInMsg);
        IBaseMessage Assemble(IPipelineContext pContext);
    }
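The mirror image of the disassembler pattern applies here: AddDocument collects the batch and Assemble returns one combined message. In this sketch the combining logic is omitted; the first collected message is simply returned as a placeholder:

    using System.Collections.Generic;
    using Microsoft.BizTalk.Component.Interop;
    using Microsoft.BizTalk.Message.Interop;

    public class BatchAssembler : IAssemblerComponent
    {
        private readonly List<IBaseMessage> batch = new List<IBaseMessage>();

        public void AddDocument(IPipelineContext pContext, IBaseMessage pInMsg)
        {
            batch.Add(pInMsg); // collect each document for the interchange
        }

        public IBaseMessage Assemble(IPipelineContext pContext)
        {
            // A real component would build a new message whose body stream
            // wraps or concatenates all the collected documents.
            return batch.Count > 0 ? batch[0] : null;
        }
    }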

IProbeMessage Interface:

Defines the methods and properties for components that need probing functionality. Any pipeline component (general, assembling, or disassembling) can implement the IProbeMessage interface if it must support message probing functionality. A probing component is used in the pipeline stages that have FirstMatch execution mode. The IProbeMessage interface exposes a single method, Probe, which enables the component to check the beginning part of the message. The return value determines whether this component is run.

    public interface IProbeMessage
    {
        bool Probe(IPipelineContext pContext, IBaseMessage pInMsg);
    }
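A Probe implementation typically peeks at the start of the body stream and returns true only if the component recognizes the content. The marker string below is hypothetical, and note the stream position is restored so later components see the full message:

    using System.IO;
    using Microsoft.BizTalk.Component.Interop;
    using Microsoft.BizTalk.Message.Interop;

    public bool Probe(IPipelineContext pContext, IBaseMessage pInMsg)
    {
        Stream body = pInMsg.BodyPart.GetOriginalDataStream();
        StreamReader reader = new StreamReader(body);
        char[] buffer = new char[64];
        int read = reader.Read(buffer, 0, buffer.Length);

        // "<Order" is an illustrative marker for this component's messages.
        bool matches = new string(buffer, 0, read).Contains("<Order");
        if (body.CanSeek)
        {
            body.Seek(0, SeekOrigin.Begin); // leave the stream as we found it
        }
        return matches; // true => this component will process the message
    }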

IPersistPropertyBag Interface:

Defines the methods to prepare for, load, and save the properties of pipeline components at design time.

This interface is responsible for handling design-time properties. If you need properties that can be set at design time or during deployment, you must add loading and saving functionality for them.

    public interface IPersistPropertyBag
    {
        void GetClassID(out Guid classID);
        void InitNew();
        void Load(IPropertyBag propertyBag, int errorLog);
        void Save(IPropertyBag propertyBag, bool clearDirty, bool saveAllProperties);
    }
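As a sketch, persisting a single design-time property (here a hypothetical "ArchivePath") through this interface might look like the following. Read and Write pass values as object references, per the COM-style API:

    using System;
    using Microsoft.BizTalk.Component.Interop;

    public class ArchiveComponent : IPersistPropertyBag
    {
        public string ArchivePath { get; set; }

        public void GetClassID(out Guid classID)
        {
            // Illustrative GUID; each component needs its own unique value.
            classID = new Guid("2d8bcf38-7f48-4a4a-9a8e-1c6b3f2d4e5a");
        }

        public void InitNew() { }

        public void Load(IPropertyBag propertyBag, int errorLog)
        {
            object val = null;
            try
            {
                propertyBag.Read("ArchivePath", out val, errorLog);
            }
            catch (ArgumentException)
            {
                // Property not present yet (e.g. first use at design time).
            }
            if (val != null) ArchivePath = (string)val;
        }

        public void Save(IPropertyBag propertyBag, bool clearDirty, bool saveAllProperties)
        {
            object val = ArchivePath;
            propertyBag.Write("ArchivePath", ref val);
        }
    }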

Custom Pipeline Components:

You can create three types of pipeline components:

  • General: this component can fit at any stage of a receive or send pipeline.
  • Assembling: this component can only fit at the Assemble stage of a send pipeline.
  • Disassembling: this component can only fit at the Disassemble stage of a receive pipeline.

Developing a General Pipeline Component

A general pipeline component is a .NET or COM component that implements the following interfaces:

  • IBaseComponent
  • IComponent
  • IComponentUI
  • IPersistPropertyBag

 Developing an Assembling Pipeline Component

An assembling component must implement the following interfaces:

  • IBaseComponent
  • IAssemblerComponent
  • IComponentUI
  • IPersistPropertyBag

Developing a Disassembling Pipeline Component

 A disassembling pipeline component receives one message on input and produces zero or more messages on output. Disassembling components are used to split interchanges of messages into individual documents. Disassembler components must implement the following interfaces:

  • IBaseComponent
  • IDisassemblerComponent
  • IComponentUI
  • IPersistPropertyBag

BAM in Orchestration

In BizTalk, BAM can be applied at the pipeline level and in orchestrations. Here we are going to implement BAM in an orchestration through BAM API programming.

For more details of BAM APIs, you can refer to

https://vkbiztalk.wordpress.com/2017/07/17/bam-api/

Normally, for BAM programming, the below three methods of the EventStream class are required:

  • Microsoft.BizTalk.Bam.EventObservation.OrchestrationEventStream.BeginActivity
  • Microsoft.BizTalk.Bam.EventObservation.OrchestrationEventStream.UpdateActivity
  • Microsoft.BizTalk.Bam.EventObservation.OrchestrationEventStream.EndActivity

If you want some more advanced features in BAM tracking, you can refer to the below reference:

https://msdn.microsoft.com/en-us/library/microsoft.biztalk.bam.eventobservation.eventstream.aspx

Requirements:

Add the below DLLs to the BizTalk project references:

  • Microsoft.BizTalk.Bam.Xlangs.dll
  • Microsoft.BizTalk.Bam.EventObservation.dll

Write the below code in orchestration expression shapes:

    // Create a new, unique activity identifier to use as the ActivityID in BAM
    string activityID = Guid.NewGuid().ToString();
    Microsoft.BizTalk.Bam.EventObservation.OrchestrationEventStream.BeginActivity(ActivityName, activityID);

    // Update the activity record.
    Microsoft.BizTalk.Bam.EventObservation.OrchestrationEventStream.UpdateActivity(ActivityName, activityID, "FileName", msg(BTS.FileName), "ProcessStart", DateTime.UtcNow);

    // End the activity record.
    Microsoft.BizTalk.Bam.EventObservation.OrchestrationEventStream.EndActivity(ActivityName, activityID);

You can use a different expression shape for each of the above methods. Be sure the EndActivity call is at the end of the orchestration and the BeginActivity call at the start.

Below is an explanation of the data and functions used in the orchestration expression shapes:

ActivityName: BAM Definition File Name

ActivityID: It can be a GUID or any unique id

UpdateActivity function:

    public virtual void UpdateActivity(string activityName, string activityInstance, params object[] data)

In UpdateActivity, after the first two parameters, the data items follow as key/value pairs.

These data items can be set in a single UpdateActivity call or across separate calls. E.g. in the BAM table below there are the following data items:

Data1, data2, starttime, endtime, errordetails, status, lastmodified.

We can write the update calls as per requirements; it is not mandatory to set all data items at one time.

    UpdateActivity("BAMSample", ActivityID, "starttime", DateTime.Now);

    UpdateActivity("BAMSample", ActivityID, "Status", "OperationStart");

BAMTable.jpg

You can track the info on the BAM Portal or in the BAM definition tables in the BAM Primary Import database.

When you deploy a BAM definition file, BAM creates five tables in the BAMPrimaryImport database.

https://vkbiztalk.wordpress.com/2017/06/14/implementation-of-bam-activity-definition-file/

You can check data in Bam_<BAM Definition File>_Completed table.

e.g. bam_BAMSample_Completed

How does EventStream work in Orchestration?

The OES API stores tracking data first in the BizTalk MessageBox database. Periodically the data is processed and persisted to the BAM Primary Import database by the Tracking Data Decode Service (TDDS).

The OES API is found in the Microsoft.BizTalk.Bam.EventObservation namespace.

BAM in Custom Pipeline

In BizTalk, BAM can be applied at the pipeline level and in orchestrations. Here we are going to implement BAM in a custom pipeline through BAM API programming.

For more details of BAM APIs, you can refer to

https://vkbiztalk.wordpress.com/2017/07/17/bam-api/

Normally, for BAM programming, the below three methods of the EventStream class are required:

  • BeginActivity
  • UpdateActivity
  • EndActivity

But if you want some more advanced features in BAM tracking, you can refer to the below reference:

https://msdn.microsoft.com/en-us/library/microsoft.biztalk.bam.eventobservation.eventstream.aspx

Normally, we write the code to produce BAM events in the Execute method of the pipeline component.

The pipeline context has a GetEventStream method that returns a MessagingEventStream.

Requirements:

Dlls:

  • Microsoft.BizTalk.Bam.EventObservation.dll
  • Microsoft.BizTalk.ExplorerOM.dll
  • Microsoft.BizTalk.Pipeline.dll
  • Microsoft.BizTalk.Streaming.dll

Namespaces:

  • using Microsoft.BizTalk.Bam.EventObservation;
  • using Microsoft.BizTalk.Message.Interop;
  • using Microsoft.BizTalk.Component.Interop;
  • using Microsoft.BizTalk.Streaming;
  • using Microsoft.BizTalk.ExplorerOM;

Example:

    public IBaseMessage Execute(IPipelineContext context, IBaseMessage inMsg)
    {
        // The pipeline context supplies a MessagingEventStream.
        EventStream BAMes = context.GetEventStream();

        string ActivityID = Guid.NewGuid().ToString(); // it can be a GUID or any unique id
        string operation = "OperationStart"; // a message context property or any custom value to track in BAM

        BAMes.BeginActivity(ActivityName, ActivityID);
        BAMes.UpdateActivity(ActivityName, ActivityID, "Data", operation);
        BAMes.EndActivity(ActivityName, ActivityID);
        BAMes.Flush();

        return inMsg;
    }

Here ActivityName: BAM Definition File Name

ActivityID: It can be a GUID or any unique id

UpdateActivity function:

    public virtual void UpdateActivity(string activityName, string activityInstance, params object[] data)

In UpdateActivity, after the first two parameters, the data items follow as key/value pairs.

These data items can be set in a single UpdateActivity call or across separate calls. E.g. if a BAM table has the following data items:

Data1, data2, starttime, endtime, errordetails, status, lastmodified.

We can write the update calls as per requirements; it is not mandatory to set all data items at one time.

    BAMes.UpdateActivity("BAMSample", ActivityID, "starttime", DateTime.Now);

    BAMes.UpdateActivity("BAMSample", ActivityID, "Status", "OperationStart");

You can track the info on the BAM Portal or in the BAM definition tables in the BAM Primary Import database. You can check the data in the Bam_<BAM Definition File>_Completed table.

e.g. bam_BAMSample_Completed

BAMTable.jpg

When you deploy a BAM definition file, BAM creates five tables in the BAMPrimaryImport database. You can refer to the following reference to create a BAM definition file.

https://vkbiztalk.wordpress.com/2017/06/14/implementation-of-bam-activity-definition-file/

How does the EventStream work in a pipeline?

MessagingEventStream (MES) is used inside a BizTalk pipeline component to write BAM data as part of the messaging transaction, ensuring that your BAM event persistence remains in sync with the BizTalk pipeline transactions.

Messaging Event Streams are asynchronous and store tracking data first in the BizTalk MessageBox database. Periodically the data is processed and persisted to the BAM Primary Import database by the Tracking Data Decode Service (TDDS).

 

BAM API

In BizTalk, BAM (Business Activity Monitoring) can be applied in pipelines and in orchestrations. BizTalk also provides the Tracking Profile Editor tool to implement BAM; you can get info about the Tracking Profile Editor from the below reference: https://msdn.microsoft.com/en-us/library/aa547038.aspx.

But this tracking tool has some limitations (for more info you can refer to https://blog.sandro-pereira.com/2010/08/20/bam-limitation-of-tracking-profile-editor-tpe ), which is why we apply BAM through programming instead. Here we are going to implement BAM in BizTalk through programming. Before coding, we need to understand the BAM API concepts used in BAM programming.

The BAM API defines four main classes:

  • DirectEventStream (synchronous, no latency): used in a .NET application to do an unbuffered, direct write of data to BAM. Here data is persisted synchronously to the BAM Primary Import database. Except for DES, all other EventStream classes are asynchronous and have some latency: the data is first persisted to the MessageBox and subsequently processed by TDDS, which moves the data into the BAMPrimaryImport database.
  • BufferedEventStream (asynchronous, high throughput, some latency): used in a .NET application to do a buffered write of data to BAM.
  • OrchestrationEventStream (Asynchronous, participates in BizTalk orchestration transactions): used within an orchestration; Provides transactional consistency with the orchestration.
  • MessagingEventStream (Asynchronous, participates in BizTalk Server pipeline transactions): used within a pipeline; Provides transactional consistency with the messaging engine.

All of these classes are derived from the base class EventStream.

Note:

For BizTalk, OrchestrationEventStream (for orchestrations) and MessagingEventStream (for pipelines) are applicable; the other two event streams are applicable for non-BizTalk applications, e.g. WCF and WF. Below are links for how to apply BAM through the BAM API in a BizTalk pipeline and orchestration.

Apply BAM in Pipeline:

https://vkbiztalk.wordpress.com/2017/07/26/bam-in-custom-pipeline/

Apply BAM in Orchestration:

https://vkbiztalk.wordpress.com/2017/07/27/bam-in-orchestration/

The EventStream APIs for BAM reside in the Microsoft.BizTalk.Bam.EventObservation namespace. To code against the APIs, you must add the following DLLs to your project references and include the namespace in your code with the following using statement:

using Microsoft.BizTalk.Bam.EventObservation;

  • Microsoft.BizTalk.Bam.EventObservation.dll (available at C:\Program Files (x86)\Microsoft BizTalk Server 2013 R2\Tracking)
  • Microsoft.BizTalk.Bam.Xlangs.dll (available at C:\Program Files (x86)\Microsoft BizTalk Server 2013 R2\Tracking): this DLL is required when coding for Orchestration event streams.
  • Microsoft.BizTalk.Pipeline.dll (available at C:\Program Files (x86)\Microsoft BizTalk Server 2013 R2): this DLL is required when coding against pipeline contexts for Messaging event streams.

Serverless Architecture (FaaS)

Serverless architecture is a hot topic in today's cloud computing market. After the IaaS (Infrastructure as a Service), PaaS (Platform as a Service), and SaaS (Software as a Service) architecture concepts, FaaS (Function as a Service) is the new concept, often called serverless architecture. It gives developers great freedom to focus on their code and innovation, and it saves companies money, because they do not need to take care of production server activities such as load balancing, high traffic, or high availability at all. Companies pay per execution, i.e. payment depends on the traffic to the application. This is very attractive for small and mid-level companies.

In a serverless architecture, the cloud provider takes care of all server-related activity and you only need to focus on coding: the provider gives you a container to deploy your code into and execute it, and the rest is taken care of by the provider. In logical terms, FaaS sits between PaaS and SaaS. AWS Lambda, Microsoft Azure Functions, Google Cloud Functions, IBM OpenWhisk (with an open-source implementation), Iron.io, and Webtask provide this kind of service.

Serverless1.jpg

Serverless computing complements DevOps because it frees developers and IT operations staff from having to set up and tune systems. Serverless code is typically triggered by specific events, meaning users pay only for the compute used once the code has been triggered.

Serverless Architecture

Before understanding FaaS, we should know the services already provided in cloud computing, as below:

Cloud.jpg

Now one more service has been introduced in cloud computing: FaaS (Function as a Service).

FaaS (Function as a Service)

It is a cloud platform model that delivers the true promises of cloud computing: infrastructure abstraction, scalability, ease of consumption, and value pricing.

FaaS is about running back-end code without managing your own server systems or your own server applications. You deploy your applications as independent functions that respond to events, are charged for only when they run, and scale automatically.

Function.jpg

  • Scalability: this is one of the best features of FaaS. In a serverless environment, the ability to scale an application to meet user demand is handled by the platform hosting the code. Whether an application has 10,000 or 10 million users does not matter. That eliminates operational concerns about pre-provisioning or over-provisioning servers.
  • Cost benefits: traditional runtime models have processes that run constantly, and the user pays for them even when they are not being utilized. A serverless environment can be more cost-effective because you are not paying a fixed cost per deployed instance, but only for the time those instances are actually doing work.
  • Event-driven: the fundamental difference between PaaS and serverless computing lies in the way the code is executed. Developers write code that is autonomous and independent of other components and services. Each component is invoked only when an event takes place. By connecting the dots, developers define the sequence in which the invocations happen at runtime. For example, developers can easily change the logic to send a push notification to a device instead of a text message without changing a single line of code; all they have to do is change the flow of events.
  • Support for multiple languages: serverless computing can support multiple runtimes, languages, and frameworks. Developers can choose their own language to implement each fine-grained piece of functionality. Independently deployable units are connected at runtime to deliver the required workflow, so developers are not restricted to one language or runtime to implement the logic.
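To make the event-driven model concrete, below is a sketch of a single FaaS unit as an HTTP-triggered Azure Function. The names are illustrative, and the attributes come from the Microsoft.Azure.WebJobs packages used by the in-process programming model:

    using Microsoft.AspNetCore.Http;
    using Microsoft.AspNetCore.Mvc;
    using Microsoft.Azure.WebJobs;
    using Microsoft.Azure.WebJobs.Extensions.Http;

    public static class GreetFunction
    {
        // The platform provisions, scales, and bills this code per execution;
        // there is no server for the developer to manage.
        [FunctionName("Greet")]
        public static IActionResult Run(
            [HttpTrigger(AuthorizationLevel.Function, "get")] HttpRequest req)
        {
            string name = req.Query["name"];
            return new OkObjectResult(
                string.IsNullOrEmpty(name) ? "Hello, world" : $"Hello, {name}");
        }
    }

The function runs only when its trigger event (here an HTTP GET) fires; swapping the trigger attribute (e.g. to a timer or queue trigger) changes the flow of events without touching the business logic.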

Where to Apply

  • Timer-based processing
  • Azure service event processing
  • SaaS event processing
  • Serverless web application architectures (WebHook URL)
  • Serverless mobile back ends
  • Real-time stream processing (IoT)
  • Real-time bot messaging (chatbots)

How does it help an organization?

  • Put your infrastructure in the hands of experts
  • Liberate your developers.
  • Avoid being locked into a monolithic provider.
  • Cost Effective

Drawbacks

  • Vendor control
  • Multitenancy Problems
  • Vendor lock-in
  • Does not support long-running processes.

If you are going to use AWS Lambda, it currently has the below limitations:

  • The code must be less than 250 MB uncompressed, or 75 MB compressed
  • It must run for no more than five minutes
  • It can access no more than 512 MB of ephemeral storage
