Geeks With Blogs

Michael Stephenson - keeping your feet on premise while your head's in the cloud

Message Archiving Pipeline Component and Windows Azure

By Michael Stephenson

I've been wanting to play around with some of the Windows Azure features for a while, and following Steve Marx's presentation at the MVP Summit and the release of the Windows Azure SDK (March CTP) I decided to give it a go.

In a previous post I discussed how I think future versions of BizTalk should look to take advantage of some of the newer technologies within the product itself. One of my suggestions was to use cloud storage as an option for HAT data.

This post isn't really intended to go into the details of my thoughts on that; instead, for my experiment with Azure blob storage, I decided to look at storing copies of BizTalk messages in the cloud.

One of my old colleagues (Nick Heppleson) manages the Message Archiving Pipeline Component project on CodePlex, so my idea was to modify his pipeline component to use the cloud for storage rather than the file system. I could then send the changes on to Nick so he can consider them as a future enhancement for that project. In addition, I intended to run my sample on BizTalk 2009. The rest of this post discusses the steps I took.

Upgrading the Message Archiving Pipeline Component

The first step was to download the message archiving pipeline component. I then set up the source code from the CodePlex project on my BizTalk 2009 VPC using the BizTalk 2009 CTP, opened the solution, and let Visual Studio 2008 upgrade it.

This was very straightforward and did not cause any problems.

Nick's code base contains a library folder with copies of a couple of the standard DLLs you would reference when creating a pipeline component. I decided not to update these to the BizTalk 2009 versions at first, and only to make the change if I found problems. I went with the assumption that, when deployed, the component would simply pick up the newer versions from the GAC. It worked fine, indicating there are no breaking changes in these assemblies.

It would probably have been better to update them, but in this example I wanted to make as few changes as possible.

 

Adding the Azure Client Storage to the project

At this point I had the solution open and ready, so I copied the ClientStorage C# project from the Azure SDK samples into it. Without going into too much detail, this project abstracts away the REST interface and gives me an easy-to-use object model for interacting with Windows Azure.

My intention is to use the blob storage features of Windows Azure.
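To give a feel for what that abstraction buys you, here is a rough sketch (in Python, with an in-memory dictionary standing in for the REST calls) of the kind of account/container/blob object model the ClientStorage sample exposes. All class and method names here are illustrative, not the SDK's actual API.

```python
class BlobContainer:
    """A named container of blobs; real calls would go over the REST API."""

    def __init__(self, name):
        self.name = name
        self._blobs = {}  # blob name -> bytes, standing in for cloud storage

    def put_blob(self, blob_name, data):
        self._blobs[blob_name] = data

    def get_blob(self, blob_name):
        return self._blobs[blob_name]


class BlobStorageAccount:
    """The facade the caller works with instead of hand-signing REST requests."""

    def __init__(self, account_name, shared_key):
        self.account_name = account_name
        self._shared_key = shared_key  # would be used to sign each request
        self._containers = {}

    def get_container(self, name):
        # Create the container on first use, mirroring 'create if not exists'.
        return self._containers.setdefault(name, BlobContainer(name))


account = BlobStorageAccount("myproject", "secret-key")
container = account.get_container("archive")
container.put_blob("message-1.xml", b"<root/>")
```

The point is simply that the caller thinks in accounts, containers and blobs rather than HTTP verbs and signed headers.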

 

Making Changes to the Pipeline Component

I now needed to make some changes to Nick's pipeline component so it would have the right information to interact with the cloud. I took the following actions:

  1. Firstly I removed the pipeline component's existing properties and replaced them with new ones so I could pass information from the BizTalk bindings to the pipeline component. The properties I needed are:
    1. The URL for blob storage
    2. The account name for my Windows Azure project
    3. The secret key granting access to my Azure project
    4. The container name where messages will be saved

       

  2. I changed the IPersistPropertyBag implementation to load and save my new properties
  3. I removed the file-system-specific code in the StreamOnReadEvent method
  4. I removed some of the additional logic for macros and other advanced features that the pipeline component supports, to make the component a little simpler for the purposes of this demo.
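The load/save change in step 2 follows the usual IPersistPropertyBag pattern: each new property is read from and written back to the bag by name. Here is a minimal Python sketch of that round-trip; the property names are my own assumptions, and a plain dict stands in for the COM property bag.

```python
class AzureArchiverConfig:
    """Holds the four settings the component needs to reach blob storage."""

    def __init__(self):
        self.blob_storage_url = ""
        self.account_name = ""
        self.shared_key = ""
        self.container_name = ""

    def load(self, bag):
        # Mirrors IPersistPropertyBag.Load: read each property by name,
        # keeping the current value if the bag does not contain it.
        self.blob_storage_url = bag.get("BlobStorageUrl", self.blob_storage_url)
        self.account_name = bag.get("AccountName", self.account_name)
        self.shared_key = bag.get("SharedKey", self.shared_key)
        self.container_name = bag.get("ContainerName", self.container_name)

    def save(self, bag):
        # Mirrors IPersistPropertyBag.Save: write every property back.
        bag["BlobStorageUrl"] = self.blob_storage_url
        bag["AccountName"] = self.account_name
        bag["SharedKey"] = self.shared_key
        bag["ContainerName"] = self.container_name


config = AzureArchiverConfig()
config.load({"AccountName": "myproject", "ContainerName": "archive"})
```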

In the modified Execute method, shown in the picture below, you can see that I continue to use the CForwardOnlyEventingReadStream that Nick originally used. This means the message is archived in a streaming fashion, which fits well with good practice for pipeline component development.
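The idea behind CForwardOnlyEventingReadStream is that the archiver never buffers the whole message: it simply observes each chunk as the downstream consumer reads it. A small Python sketch of that pattern (the class and callback names are illustrative, not Nick's actual code):

```python
import io


class ForwardOnlyEventingReadStream:
    """Wraps an inner stream and fires a callback with each chunk read, so an
    archiver can observe the data while something else drives the reading."""

    def __init__(self, inner, on_read):
        self._inner = inner
        self._on_read = on_read

    def read(self, size=-1):
        chunk = self._inner.read(size)
        if chunk:
            self._on_read(chunk)  # the archiver hooks in here
        return chunk


archived = []
stream = ForwardOnlyEventingReadStream(io.BytesIO(b"<message body/>"), archived.append)
while stream.read(4):  # the downstream consumer reads in small chunks
    pass
```

After the loop, `archived` holds every chunk in order, even though the archiver never asked for the data itself.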

 

I then implemented my new logic in the StreamOnReadEvent method, which saves the message to cloud storage. You can see in the picture below how, as the stream is read, I upload the information from the buffer to the cloud via the StorageClient API. I also have logic to work out whether I need to create or update the blob.
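The create-or-update check boils down to: create the blob on the first chunk, then append each subsequent chunk. A minimal sketch of that logic, with a dictionary standing in for the cloud container (the class and method names are my own, not the StorageClient API):

```python
class ChunkedBlobUploader:
    """Uploads a message to a blob one buffer at a time as the stream is read."""

    def __init__(self, container):
        self._container = container  # dict standing in for the cloud container

    def on_read(self, blob_name, chunk):
        if blob_name not in self._container:
            self._container[blob_name] = chunk   # first chunk: create the blob
        else:
            self._container[blob_name] += chunk  # later chunks: update/append


cloud = {}
uploader = ChunkedBlobUploader(cloud)
for piece in (b"<root>", b"payload", b"</root>"):
    uploader.on_read("msg-001.xml", piece)
```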

 

Setting up the Azure Project

To be honest, there isn't really that much to setting up the Azure project. Once you have registered and signed in to the Windows Azure Developer Portal at http://www.microsoft.com/azure/signin.mspx you can proceed to the Add New Project button. The diagram below shows the type of project you should set up.

 

Once you are viewing the project, you can see in the picture below that it provides you with the endpoints and keys to access your storage - obviously I have changed the key since posting this :-)

 

Your Azure project is now setup and ready to begin accepting data.

Setting up the BizTalk Solution

In order to use the pipeline component I set up a very simple BizTalk implementation. I'm not going to go into too much detail, but it is summarised below:

  • I will have a BizTalk application with a receive location which picks up a file from a folder
  • There will be a send port which subscribes to the receive port and pushes messages out to another file
  • On the receive port I will set up a custom pipeline containing the Message Archiving Component in the decode stage of the receive pipeline
  • I will configure the bindings using per-instance pipeline component configuration. The picture below shows the configuration

In the picture you can see the Azure settings. One key point to note here is that your project name needs to be provided in lower case, and the same goes for the container name (container names can contain only lower-case letters, numbers and hyphens).
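For reference, the naming rules Azure documents for blob containers can be checked up front: 3 to 63 characters, all lower case, letters, numbers and hyphens only, starting with a letter or number and with no consecutive hyphens. A quick sketch of that check (my own helper, not part of the SDK):

```python
import re

# Documented rules for Azure blob container names: 3-63 characters,
# lower-case letters, numbers and hyphens only, starting with a letter
# or number and with no consecutive hyphens.
_CONTAINER_NAME = re.compile(r"[a-z0-9](?:-?[a-z0-9])*")


def is_valid_container_name(name):
    return 3 <= len(name) <= 63 and _CONTAINER_NAME.fullmatch(name) is not None


assert is_valid_container_name("archive")
assert not is_valid_container_name("Archive")        # upper case rejected
assert not is_valid_container_name("my__container")  # underscores rejected
```

Validating the name before the first upload gives a clearer error than a failed REST call at runtime.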

 

Testing the result

Testing the process is simple: drop a file into the input folder and it comes out at the output folder. The next question is how to see what is in the cloud.

I chose to use the Azure Explorer project from CodePlex to examine what is in my cloud containers. In the picture below you can see the following:

  • Left panel = the containers I have created
  • Top right panel = the blobs that have been uploaded
  • Bottom right panel = the content of the blob I have chosen to display

 

 

Summary

Hopefully this post shows that it was relatively simple to start taking advantage of some of the Windows Azure features to complement a BizTalk solution. While there are still a lot of hurdles to overcome before the cloud is fully commercial and available to use, it is good to think about what this new paradigm offers us as solution developers and the benefits it can bring in solving some of the common problems we face.

In this post you can also see how simple it was to modify Nick's already excellent component to support this, and hopefully it is something Nick may offer in future versions. In terms of design, I certainly do not think you should archive to the cloud in all cases; there is additional overhead in streaming the message out over the internet. But in certain circumstances it offers an interesting new choice. Remember that the current message tracking to HAT has enough differing requirements and considerations to have created a place for Nick's component in the first place. The additional benefits that cloud storage offers are not having to worry about running out of storage and not having to maintain an archiving or backup policy; you would potentially just pay a cost based on the amount of storage you require.

Back at the start of the post, my initial reason for suggesting that HAT could benefit from cloud storage was that HAT essentially offers the option to do the actual archiving outside of the normal BizTalk process, whereas my modification to the Message Archiving Pipeline Component streams the message to the cloud as the service instance streams it towards the MessageBox. This obviously has potential issues around internet connectivity and what to do if the upload fails.

Anyway, hopefully this provides some ideas for more work in this area.

The modified code for this sample can currently be downloaded from the link below, but hopefully once Nick has looked over a copy he can keep the sample on the CodePlex page:

http://www.box.net/shared/0t5o39tddp

 

Posted on Thursday, April 2, 2009 1:10 AM | BizTalk




Copyright © Michael Stephenson | Powered by: GeeksWithBlogs.net