
Dynamics AX 2012 R2 File based integration with BizTalk Server 2013


 

I am currently involved in a project where I need to integrate Dynamics AX with a 3rd party payroll system.  Dynamics AX 2012 provides a rich integration framework called the Application Integration Framework (AIF).  One of the core strengths of this framework is the ability to generate a typed schema through a configuration wizard.  Combine that with AX's ability to create inbound and outbound ports and you have the ability to generate export (and import, if needed) files rather quickly.

When I mentioned a "typed schema" in the previous paragraph, I meant that AX will generate XSD schemas that we can include in our BizTalk projects.  This is a breath of fresh air compared to some other ERP systems, where you are handed a CSV flat file and have to build a flat file schema for it.

In my scenario I was receiving a list of Work Orders, so a colleague working on the AX side was able to provide me with the Work Order schema and an imported schema that includes AX types.  At that point I added the schemas to my solution, built my map and wired everything up.  When I ran an initial test, I was presented with the following error:

Details: The published message could not be routed because no subscribers were found. This error occurs if the subscribing orchestration or send port has not been enlisted, or if some of the message properties necessary for subscription evaluation have not been promoted.

This is a pretty standard error message that basically means BizTalk received a message that it was not expecting.  The reason BizTalk was not expecting it is that AX wraps outbound messages in a SOAP Envelope, as you can see in the image below.

 

image

SOAP Envelopes are certainly nothing new, but I didn't expect AX to use them when writing a file to the file system.  When receiving messages via Web/WCF services, BizTalk automatically takes care of extracting the message body from the incoming SOAP message for us.  With the FILE Adapter, that facility just does not exist.

You will notice in the screenshot below that there is a namespace that is specific to AX.  This got me thinking that AX probably has an XSD for this message type as well.

image

After digging around a bit I found the AX schemas in the Program Files\Microsoft Dynamics AX\60\Server\MicrosoftDynamicsAX\bin\Application\Share\Include folder.  The schema that I was looking for is called Message.xsd.

image

Just adding this schema to the BizTalk project was not enough.  I needed to make a few small tweaks:

  • Click the "Schema" icon of the schema and set the Envelope property to True.  This instructs BizTalk that it is an envelope schema, so that when BizTalk sees this message it will strip out the envelope, which in this case is a SOAP Envelope.

image

  • Set the Body XPath property by selecting the root node of the schema and then populating the appropriate value, which in this case is:

/*[local-name()='Envelope' and namespace-uri()='http://schemas.microsoft.com/dynamics/2011/01/documents/Message']/*[local-name()='Body' and namespace-uri()='http://schemas.microsoft.com/dynamics/2011/01/documents/Message']/*[local-name()='MessageParts' and namespace-uri()='http://schemas.microsoft.com/dynamics/2011/01/documents/Message']

image
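To visualize what these two settings accomplish, the SOAP-wrapped file that AX emits has roughly the following shape.  This is a hand-written sketch based on the XPath above and the screenshots; the actual document carries additional header elements and attributes:

    <Envelope xmlns="http://schemas.microsoft.com/dynamics/2011/01/documents/Message">
      <Header>
        <!-- AX message metadata -->
      </Header>
      <Body>
        <MessageParts>
          <!-- the actual payload, e.g. the Work Order document -->
        </MessageParts>
      </Body>
    </Envelope>

With the Envelope property set to True and the Body XPath pointing at MessageParts, BizTalk will strip away everything outside MessageParts and publish only the payload.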

We can now deploy our application.  When it comes to the Receive Location that will be picking up this message, we want to ensure that we are using the XMLReceive pipeline.  Within this pipeline, the XML Disassembler stage will take care of the SOAP envelope so that when the message body is presented to the MessageBox, any subscribers will receive the expected message body.

Conclusion

When I first discovered that I was receiving a SOAP-wrapped message, my instinct was that maybe AX could just use a WCF port instead of a FILE port.  This just wasn't the case: there are only two options when it comes to configuring an outbound port, FILE and MSMQ.  Using MSMQ would not have helped me in this case, as the same issue would have existed.

AX certainly does provide the ability to call a WCF service, but it is a more custom approach.  I would have had to expose this schema as a WCF service, and then my AX colleagues would have had to write code against the proxy to populate all of the different data elements.  This would have defeated the purpose of using the AIF framework to expedite delivering a solution under very tight timelines.  Luckily, with a little tinkering we were able to come up with a reasonable solution without writing custom code.

I have to think that AX is wrapping these messages in a SOAP Envelope for a reason.  Perhaps a WCF outbound port is coming in an upcoming release?


BizTalk360 - Monitoring Service High Availability Feature


 

I recently went through my second implementation of BizTalk360 and ran into a feature that I wasn't previously aware of.  Typically I have installed BizTalk360 on a BizTalk Server itself, which poses a bit of a risk if you only install it on one BizTalk Server and that BizTalk Server happens to be offline.

My current environment consists of a multi-node cluster (an actual cluster with Failover Cluster Services).  I recently asked Saravana Kumar if this was the way to go when looking for a redundant monitoring solution.  He indicated that my idea would work and is completely supported, however I may want to look into a new feature called Monitoring Service High Availability.  When using this feature, BizTalk360 maintains its state by storing it in the database.  In my case, one node will be active and the second node will be passive, much like a service being managed by Windows Failover Clustering.

To access this feature click on the Settings link in the top right hand corner of the screen.

image

Next, click on the Monitoring Service High Availability link.

image

Even though the BizTalk360 Service is actively running on both Servers (in my case), BizTalk360 is designating one of the servers as being the primary.

image

We have the ability to change the primary server by selecting it and then clicking on the Bring Server Active button.

image

Instantly our primary will switch to becoming a secondary and vice-versa.  This was very quick; much quicker than I have experienced failing over a service using Windows Failover Clustering.

image

The next test is to take our primary service (or server) offline.  To do this I will just stop the BizTalk360 service, simulating what would occur if our service stopped or we lost our entire primary server.  To make this test even more real, I am going to enable a test alert, make sure I receive the first alert and then stop the BizTalk360 Service.  My expectation is that my second node will become primary and I should receive another test alert.  This time the alert will be generated from the newly activated node.

 

Below I have configured an existing alarm to function in TEST MODE.

image

I have received my alert as expected.

image

I will now stop the BizTalk360 Service on Node 1.

image

If I navigate back to the Monitoring Services High Availability screen I find that my “Node 2” is now the active server and my “Node 1” is no longer participating as it is offline.

image

If I check my inbox, I find that I continue to receive these “TEST Alerts” from BizTalk360.  This time the alerts are coming from my 2nd Node.

image

If we now go back to my 1st Node and start the BizTalk360 Service, we will discover that BizTalk360 has recognized that the service is back online but is in a passive state.

image

Conclusion

I have been around Windows Failover Clustering for quite some time and am comfortable working within that environment.  The BizTalk environments that I have used in the past also tend to leverage Windows Failover Clustering in order to support clustered Host Instances for adapters such as S/FTP, POP3 and Database Polling.  Using Windows Failover Clustering is an option for ensuring BizTalk360 is online and redundant, but it is not a prerequisite.  As I have demonstrated in this post, BizTalk360 provides this functionality out of the box.  This is great news, especially for those who have multi-node BizTalk environments but do not have (or need) Windows Failover Clustering.  This gives you peace of mind that in the event one of your BizTalk Servers goes offline, as long as you have BizTalk360 installed on another node your coverage will not be interrupted.  Kudos to the BizTalk360 team for building such an important feature and making it very easy to use!

European Trip Recap


 

I recently returned from Europe where I had a chance to participate in two extraordinary events: Bouvet BizTalk Innovation Day(s) in Stavanger, Norway and the 40th running of the Berlin Marathon.

Bouvet BizTalk Innovation Day–Norway Recap

This was a two day event hosted by Bouvet.  For those of you who are not familiar with Bouvet, they provide services in the fields of information technology, digital communication and enterprise management, and have about 900 employees divided between 14 offices in Norway and Sweden (see more at: http://www.bouvet.no/en/About-Bouvet/).

On day one, each of the speakers had the opportunity to present their topic to a crowd of around 70 BizTalk professionals from all over Scandinavia.  The topics ranged from newer technologies like Windows Azure BizTalk Services, Windows Azure Mobile Services and Windows Azure Service Bus to more universal topics like being proactive when monitoring the health of BizTalk solutions, BizTalk mapping patterns, identifying and rescuing a BizTalk hostage project, and a preview of the next version of BizTalk360.  There was also a special keynote by Paolo Salvatori, who works for Microsoft Italy and is recognized worldwide for his abilities.  All presentations were very well received, as indicated by the attendee surveys.

My presentation focused on Enterprise Mobility.  This is a topic that I have been dealing with at my day job so I had an opportunity to demonstrate some of the areas of enterprise mobility that I have been thinking about lately.  It was also an opportunity to demonstrate a ‘reference’ application that I have been collaborating on with Mikael Hakansson.

Some of the core principles that I have taken into consideration when dealing with Enterprise Mobility include:

  • Active Directory Federation: When a person leaves the company and their AD account has been disabled, this “tap” should be turned off for other mobile/cloud based services.
  • Leverage a Mobility platform to reduce the diversity required in supporting multiple platforms.  Windows Azure Mobile Services helps us address this by providing APIs for the popular platforms that allow us to centralize activities like Data access, Authentication, Identity Providers, Custom APIs and Scheduled tasks.
  • Most, if not all, Enterprise Mobile apps need to consume Line of Business (LOB) System data.  Windows Azure BizTalk Services (and the BizTalk Adapter Service) allow us a secure way in and out of our enterprise without poking holes in firewalls.  I should note that these capabilities are also available with BizTalk Server 2013.
  • Accessing On-Premise LOB systems isn’t possible (in my scenarios) without the underpinnings of the Windows Azure Service Bus.  Using this technology to span network layers never gets old. The BizTalk Adapter Service has a strong dependency on these services.
  • Data Storage:  Even though I am leveraging SAP master data in this scenario, I do need to maintain the state of the business process.  In this case I am using SQL Azure to host our data.  We can leverage Windows Azure Mobile Services' APIs, which make getting data in and out of the database a breeze (see the sketch after this list).
  • Finally, we can’t forget about Toast Notifications.  We want the ability to send notifications out to users (in this case approvers) and Windows Azure Mobile Services helps us deal with sending Toast Notifications to a variety of platforms. 
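To make the data access point concrete, here is a minimal client-side sketch of the kind of code Windows Azure Mobile Services enables.  The WorkOrder type, service URL and application key are hypothetical placeholders, not the actual demo code:

    using Microsoft.WindowsAzure.MobileServices;
    using System.Threading.Tasks;

    public class WorkOrder
    {
        public string Id { get; set; }        // row id in the backing SQL Azure table
        public string Status { get; set; }    // hypothetical business state field
    }

    public class ApprovalData
    {
        // Placeholders: point these at your own Mobile Service
        private static readonly MobileServiceClient Client =
            new MobileServiceClient("https://your-service.azure-mobile.net/", "your-application-key");

        public static Task SaveAsync(WorkOrder order)
        {
            // Mobile Services turns this call into a REST insert against the SQL Azure table
            return Client.GetTable<WorkOrder>().InsertAsync(order);
        }
    }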

Here is one of the scenarios from my demo that illustrates many of the principles that were previously mentioned.

image     

A few screenshots of the application running in the Windows Phone Emulator:

image  image  image  image

This was one of the more challenging demos that I have ever been involved in.  I had a lot of fun working on this reference app with Mikael and learned a lot in the process.  My complete slide deck can be found here.

Conclusion

Many people worked on bringing this event alive, but two people who I would like to recognize are Tord Glad Nordahl and Anders Stensland.  They, in addition to the support Bouvet provided, pulled off a fantastic event.  I had the opportunity to present in Sweden in 2010 and 2011, and I continue to be amazed by the amount of BizTalk interest in Scandinavia.  If you have the opportunity to attend a future Bouvet BizTalk Innovation conference, I highly recommend it.

40th Berlin Marathon

One of my hobbies is running.  I am a pretty average runner, but I enjoy the challenges of running and also try to reap the health benefits of staying active.  I have run over 12 half marathons over the past 6 years and finished my first marathon last year in Chicago.  Whenever I have gone to Europe to speak in the past, I have always tried to make a side trip to experience another culture.  In speaking with one of the other presenters (Steef-Jan Wiggers), we decided that we would head to Berlin after the conference in Norway, as he recommended experiencing its rich history.  Having never been to Germany, my wife and I made plans to join him in Berlin.

I knew that the Berlin Marathon was held in late September.  The Berlin Marathon is one of the 6 major marathons in the world; the others include New York, Boston, Chicago, London and Tokyo.  So when I found out that I would be in Berlin on the same day as this historic event, I couldn't resist the temptation to participate.

The registration deadline had passed, but I was able to find a travel agent from Boston who would sell us packages.  With this information, I presented the opportunity to Steef-Jan and he obliged.  He has recently gotten back into running, and this provided a great opportunity for him to run his first marathon.

The event itself was rather amazing.  Over 42 000 runners participated in the event with an estimated 1 million spectators.  It was an awesome experience and one that I will never forget.  I finished the marathon in 4 hours 34 minutes and 56 seconds which was 4 minutes faster than my Chicago time.

 

A few pictures:

The prize

Medal

 

Before the race.  The garbage bags helped keep us warm while we waited for our turn.

KentBefore

Steef-Jan before the race

SteefBefore

 

After the run

Kent_SteefAfter

 

Celebrating – German style

Celebrating

 

After the race the Adidas store would engrave your time into a running band that was provided as part of your registration.

Timeband

 

 

MVP Profile

One of the best parts of the MVP program is the people you meet and the friendships that you develop.  Without being in the MVP program, this trip would have never happened.  Being part of the program is truly an honor.

Thanks Tord for your hospitality in Norway.  It was a great opportunity to experience my Norwegian heritage and I thoroughly enjoyed your beautiful country. 

Thanks Steef for being an amazing tour guide while in Germany.  Your German came in handy many times and I learned a lot about German history while I was there.  Running the marathon with you was also a great experience.  Next time we won't do as much sightseeing the day before the race. ;)

I also would like to thank the other MVPs (Sandro, Nino, Saravana) and Paolo for a great experience as well.  Talking shop whenever we get together is a lot of fun and always interesting. 

BizTalk360 Product Specialist award


 

This post is long overdue but I felt it was necessary to write.  Back in April 2013, Saravana Kumar and the BizTalk360 team introduced the BizTalk360 Product Specialist award.  The primary objective of the program is to honour individuals who have gained adequate knowledge in installing, configuring and implementing the BizTalk360 solution at customer sites.

I have blogged (here and here) and even wrote a whitepaper about some of my experiences with BizTalk360, and I am a strong supporter of the product.  I have seen the benefits first hand while leading teams who are responsible for the operational support of busy BizTalk environments.  I have also witnessed its adoption by non-BizTalk experts and seen their productivity increase without being intimidated by larger, complex monitoring solutions.

Recently I introduced BizTalk to a new organization, and BizTalk360 was a tool that provided immediate benefit.  Sure enough, a source system experienced issues that led to some suspended messages, and the BizTalk team knew about the problem before the system owners did.  The end result was that the issues in the source system could be identified and resolved quickly, limiting the disruption to the business.

While I was in Norway, Saravana had a bit of a surprise for me: some hardware to keep my MVP awards company.  I just wanted to take this opportunity to thank Saravana and the rest of the BizTalk360 team for their recognition, and I am looking forward to working with Version 7.0 of the product.  I got a sneak peek of the application while in Norway and it looks great.

 

BizTalk360Award

BizTalk Summit 2013 Wrap-up


On November 21st and 22nd I had the opportunity to spend a couple days at the 2nd annual BizTalk Summit held by Microsoft in Seattle.  At this summit there were approximately 300 Product Group members, MVPs, Partners and Customers.  It was great to see a lot of familiar faces from the BizTalk community and talk shop with people who live and breathe integration.

Windows Azure BizTalk Services reaches GA

The Summit started off with a bang when Scott Gu announced that Windows Azure BizTalk Services has reached General Availability (GA)!!!  What this means is that you can receive production level support from Microsoft with a 99.9% uptime SLA.

 

image

During the preview period, Microsoft was offering a 50% discount on Windows Azure BizTalk Services (WABS).  This preview pricing ends at the end of the year.  So if you have any Proof of Concept (POC) apps running in the cloud that you aren’t actively using, please be aware of any potential billing implications.

Release Cadence

The next exciting piece of news coming from Microsoft is the updated release cadence for the BizTalk Server product line.  As you have likely noticed, there is usually a BizTalk release shortly after the General Availability of platform updates.  So when a new version of Windows Server, SQL Server or Visual Studio is launched, a BizTalk Server release usually closely follows.  Something that is changing within the software industry is the accelerated release cadence of Microsoft and its competitors.  A recent example of this accelerated cadence is Windows 8.1, Windows Server 2012 R2 and Visual Studio 2013; these releases occurred much sooner than they have in the past.  As a result of these new accelerated timelines, the BizTalk Product Group has stepped up, committing to a BizTalk release every year!  These releases will alternate between R2 releases and major releases.  For 2014 we can expect a BizTalk 2013 R2 release, and in 2015 we can expect a full release.

BizTalk Server 2013 R2

So what can we expect in the upcoming release?

  • Platform alignment(Windows, SQL Server, Visual Studio) and industry specification updates (SWIFT).
  • Adapter enhancements including support for JSON (Yay!), Proxy support for SFTP and authorization enhancements for Windows Azure Service Bus.  A request I do have for the product team is please include support for Windows Server Service Bus as well.
  • Healthcare Accelerator improvements.  Interestingly, healthcare is the fastest growing vertical for BizTalk Server, which justifies the additional investment.

image

 

Hybrid Cloud Burst

There were a lot of good sessions but one that I found extremely interesting was the session put on by Manufacturing, Supply Chain, and Information Services (MSCIS).  This group builds solutions for the Manufacturing and Supply Chain business units within Microsoft. You may have heard of a “little” franchise in Microsoft called XBOX.  The XBOX franchise heavily relies upon Manufacturing and Supply chain processes and therefore MSCIS needs to provide solutions that address the business needs of these units.  As you are probably aware, Microsoft has recently launched XBOX One which is sold out pretty much everywhere.  As you can imagine building solutions to address the demands of a product such as XBOX would be pretty challenging.  Probably the biggest hurdle would be building a solution that supports the scale needed to satisfy the messaging requirements that many large Retailers, Manufacturers and online customers introduce.

In a traditional IT setting you throw more servers at the problem.  The issue is that this is horribly inefficient.  You are essentially building for the worst case (or most profitable) scenario, but when things slow down you have spent a lot of money and your resources are poorly utilized.  This leads to a high total cost of ownership (TCO).

Another challenge in this solution is that an ERP is involved.  In this case it is SAP (but this would apply to any ERP), and you cannot expect an ERP to provide the performance to support 'cloud scale', at least not in a cost competitive way.  If you build the system in an asynchronous manner, you can throttle your messaging and avoid overwhelming your ERP system.

MSCIS has addressed both of these major concerns by building out a Hybrid solution. By leveraging Windows Azure BizTalk Services and Windows Azure Service Bus Queues/Topics in the cloud they can address the elasticity requirements that a high demand product like XBOX One creates. As demand increases, additional BizTalk Services Units can be deployed so that Manufacturers, Retailers and Customers are receiving proper messaging acknowledgements.  Then On-Premise you can keep your traditional capacity for tools and applications like BizTalk Server 2013 and SAP without introducing significant infrastructure that will not be fully utilized all the time.

Our good friend Mandi Ohlinger, who is a technical writer with the BizTalk team, worked with MSCIS to document the solution.  You can read more about it on the BizTalk Dev Center.  I have included a picture of the high-level architecture below.

image

While Microsoft is a large software company (ok, a Devices and Services company), what we often lose sight of is that Microsoft is also a very large company (>100 000 employees) with enterprise problems just like any other company.  It was great to see how Microsoft uses their own software to address real world needs.  Sharing these types of experiences is something that I would really like to see more of.

Symmetry

(These are my own thoughts and do not necessarily reflect Microsoft’s exact roadmap)

If you have evaluated Windows Azure BizTalk Services, you have likely realized that there is not currently symmetry between the BizTalk Service and BizTalk Server.  BizTalk Server has had around 14 years (or more) of investment, whereas BizTalk Services, in comparison, is relatively new.  Within Services we are still without core EAI capabilities like Business Process Management (BPM)/Orchestration/Workflow, Business Activity Monitoring (BAM), the Business Rules Engine (BRE), a comprehensive set of adapters and a complete management solution.

With BizTalk Server we have a mature, stable, robust integration platform.  The current problem is that it was built well before people started thinking about cloud scale.  Technologies such as MSDTC and even the MessageBox have contributed to BizTalk being what it is today (a good platform), but they do not necessarily lend themselves to new cloud based platforms.  If you look under the hood in BizTalk Services you will find neither of these technologies in place.  I don't necessarily see this as a bad thing.

A goal of most, if not all, products that Microsoft is putting in the cloud is symmetry between On-Premise and cloud based offerings.  This puts the BizTalk team in a tough position.  Do they try to take a traditional architecture like BizTalk Server and push it into the cloud, or build an architecture on technologies that better lend themselves to the cloud and then push them back on-premise?  The approach going forward is innovating in the cloud and then bringing those investments back on-premise in the future.

Every business has a budget and priorities have to be set.  I think Microsoft is doing the right thing by investing in the future instead of making a lot of investments in the On-Premise offering that we know will be replaced by the next evolution of BizTalk.  There were many discussions between the MVPs during this week in Seattle on this subject with mixed support across both approaches. With the explosion of Cloud and SaaS applications we need an integration platform that promotes greater agility, reduces complexity and addresses scale in a very efficient manner instead of fixing some of the deficiencies that exist in the current Server platform. I do think the strategy is sound, however it will not be trivial to execute and will likely take a few years as well.

Adapter Eco-system

Based upon some of the sessions at the BizTalk Summit, it looks like Microsoft will be looking to create a larger ISV eco-system around BizTalk Services, more specifically in the adapter space.  The reality is that the current adapter footprint in BizTalk Services is lacking compared to some competing offerings.  One way to address this gap is to let trusted 3rd parties build and make their adapters available through some sort of marketplace.  I think this is a great idea, provided some rigor is applied to the process of submitting adapters.  I would not be entirely comfortable running mission critical processes on an adapter that someone built as a hobby.  However, I would not have an issue purchasing an adapter in this fashion from established BizTalk ISV partners like BizTalk360 or /n Software.

Conclusion

All in all, it was a good summit.  It was encouraging to see the BizTalk team take BizTalk Services across the goal line and make it GA.  It was also great to see that they have identified the need for an accelerated release cadence and shared some news about the upcoming R2 release.  Lastly, it was great to connect with so many familiar faces within the BizTalk community.  The BizTalk community is not huge, but it is definitely international, so it was nice to chat in person with people you are used to interacting with over Twitter, blogs or LinkedIn.

In the event you still have doubts about the future of BizTalk, rest assured the platform is alive and well!

BizTalk 2013–Integration with Amazon S3 storage using the WebHttp Adapter


I recently encountered a requirement where we had to integrate a legacy Document Management system with Amazon in order to support a Mobile Field Worker application.  The core requirement is that when a document reaches a certain state within the Document Management System, we need to publish the file to an S3 instance where it can be accessed from a mobile device.  We will do so using a RESTful PUT call.

Introduction to Amazon S3 SDK for .Net

Entering this solution I knew very little about Amazon S3.  I did know that it supported REST and therefore felt pretty confident that BizTalk 2013 could integrate with it using the WebHttp adapter.

The first thing that I needed to do was create a developer account on the Amazon platform.  Once I created my account, I downloaded the Amazon S3 SDK for .Net.  Since I will be using REST, this SDK is technically not required; however, it includes a beneficial tool called the AWS Toolkit for Microsoft Visual Studio.  With this toolkit we can manage our various AWS services, including our S3 instance.  We can create, read, update and delete documents using this tool.  We can also use it in our testing to verify that a message has reached S3 successfully.

image

Another benefit of downloading the SDK is that we can use the managed libraries to manipulate S3 objects and better understand some of the terminology and functionality that is available.  A side benefit is that we can fire up Fiddler while using the SDK and see how Amazon forms its REST calls, under the hood, when communicating with S3.
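For example, here is a minimal upload sketch using the managed SDK.  The bucket name, local path and credentials are placeholders, and the exact request syntax varies slightly between SDK versions:

    using Amazon.S3;
    using Amazon.S3.Model;

    public class S3UploadSketch
    {
        public static void Upload()
        {
            // Authenticate with your Amazon Key ID and Secret Access Key
            var client = new AmazonS3Client("<your_keyId>", "<your_SecretKey>");

            var request = new PutObjectRequest
            {
                BucketName = "<your_bucket>",
                Key = "310531500150800.PDF",              // the S3 resource name used later in this post
                FilePath = @"C:\Drop\310531500150800.PDF" // local file to upload (placeholder)
            };

            // Running this with Fiddler attached reveals the signed REST PUT described below
            client.PutObject(request);
        }
    }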

Amazon S3 Accounts

When you sign up for an S3 account you will receive an Amazon Key ID and a Secret Access Key. These are two pieces of data that you will need in order to access your S3 services.  You can think of these credentials much like the ones you use when accessing Windows Azure Services.

image

BizTalk Solution

To keep this solution as simple as possible for this blog post, I have stripped out some of the original components so that we can focus strictly on what is involved in getting the WebHttp Adapter to communicate with Amazon S3.

For the purpose of this blog post the following events will take place:

  1. We will receive a message of type System.Xml.XmlDocument.  Don't let this mislead you; we can receive pretty much any type of message using this message type, including text documents, images and PDF documents.
  2. We will then construct a new instance of the message that we just received in order to manipulate some adapter context properties.  You may now be asking: why do I want to manipulate adapter context properties?  The reason is that since we want to change some of our HTTP header properties at runtime, we need to use a Dynamic Send Port, as identified by Ricardo Marques.

    image

    The most challenging part of this Message Assignment Shape was populating the WCF.HttpHeaders context property.  In C# if you want to populate headers you have a Header collection that you can populate in a very clean manner:

    headers.Add("x-amz-date", httpDate);

    However, populating this property in BizTalk isn't as clean.  You need to construct a string and append all of the related properties together, separating each header attribute onto a new line by appending "\n".

    Tip: Don't try to build this string in a helper method.  The \n characters will be encoded and the equivalent values will not be accepted by Amazon, which is why I have built this string inside an Expression Shape, as shown in the sketch below.
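    To illustrate, the Expression Shape assignment looks roughly like the following.  This is a sketch: AmazonS3Helper is a hypothetical name for the class holding the helper methods shown later in this post, and strHttpDate is an orchestration variable populated from SetHeaderDate().

        // Each header goes on its own line, separated by "\n"
        msgS3Request(WCF.HttpHeaders) =
            "x-amz-acl: " + AmazonS3Helper.SetAmzACL() + "\n" +
            "x-amz-storage-class: " + AmazonS3Helper.SetStorageClass() + "\n" +
            "x-amz-date: " + strHttpDate + "\n" +
            "Authorization: " + AmazonS3Helper.SetHttpAuth(strHttpDate) + "\n" +
            "Content-Type: application/x-pdf";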

    After I send a message (tracked by BizTalk), I should see an HTTP header that looks like the following:

    <Property Name="HttpHeaders" Namespace="http://schemas.microsoft.com/BizTalk/2006/01/Adapters/WCF-properties" Value=

    "x-amz-acl: bucket-owner-full-control
    x-amz-storage-class: STANDARD
    x-amz-date: Tue, 10 Dec 2013 23:25:43 GMT
    Authorization: AWS <AmazonKeyID>:<EncryptedSignature>
    Content-Type: application/x-pdf
    Expect: 100-continue
    Connection: Keep-Alive"/>

    For the meaning of each of these headers I will refer you to the Amazon documentation.  However, the one header that does warrant some additional discussion here is the Authorization header.  This is how we authenticate with the S3 service, and constructing this string requires some additional understanding.  To simplify the population of this value I have created the following helper method, which was adapted from the following post on StackOverflow:

    // Requires: using System; using System.Text; using System.Security.Cryptography;
    public static string SetHttpAuth(string httpDate)
    {
        string AWSAccessKeyId = "<your_keyId>";
        string AWSSecretKey = "<your_SecretKey>";

        // The canonical string reflects the verb, content type, amz headers and resource of the REST call
        string canonicalString = "PUT\n\napplication/x-pdf\n\nx-amz-acl:bucket-owner-full-control\nx-amz-date:" + httpDate + "\nx-amz-storage-class:STANDARD\n/<your_bucket>/310531500150800.PDF";

        // now encode the canonical string
        Encoding ae = new UTF8Encoding();
        // create a hashing object; the secret key is the hash key
        HMACSHA1 signature = new HMACSHA1();
        signature.Key = ae.GetBytes(AWSSecretKey);
        byte[] bytes = ae.GetBytes(canonicalString);
        byte[] moreBytes = signature.ComputeHash(bytes);
        // convert the hash byte array into a base64 encoding
        string encodedCanonical = Convert.ToBase64String(moreBytes);
        // finally, this is the Authorization header
        string AuthHeader = "AWS " + AWSAccessKeyId + ":" + encodedCanonical;

        return AuthHeader;
    }

    The most important part of this method is the following line(s) of code:

    string canonicalString = "PUT\n\napplication/x-pdf\n\nx-amz-acl:bucket-owner-full-control\nx-amz-date:" + httpDate + "\nx-amz-storage-class:STANDARD\n/<your_bucket>/310531500150800.PDF";
                

    The best way to describe what is occurring is to borrow the following from the Amazon documentation.

    The Signature element is the RFC 2104 HMAC-SHA1 of selected elements from the request, and so the Signature part of the Authorization header will vary from request to request. If the request signature calculated by the system matches the Signature included with the request, the requester will have demonstrated possession of the AWS secret access key. The request will then be processed under the identity, and with the authority, of the developer to whom the key was issued.

    Essentially we are going to build up a string that reflects the various aspects of our REST call (headers, date, resource) and then create a hash using our Amazon secret.  Since Amazon also knows our secret, they can compute the same hash and check whether it matches the signature in our actual REST call.  If it does, we are golden.  If not, we can expect an error like the following:

    A message sent to adapter "WCF-WebHttp" on send port "SendToS3" with URI http://<bucketname>.s3-us-west-2.amazonaws.com/ is suspended.
    Error details: System.Net.WebException: The HTTP request was forbidden with client authentication scheme 'Anonymous'.
    <?xml version="1.0" encoding="UTF-8"?>
    <Error><Code>SignatureDoesNotMatch</Code><Message>The request signature we calculated does not match the signature you provided. Check your key and signing method.</Message><StringToSignBytes>50 55 54 0a 0a 61 70 70 6c 69 63 61 74 69 6f 6e 2f 78 2d 70 64 66 0a 0a 78 2d 61 6d 7a 2d 61 63 6c 3a 62 75 63 6b 65 74 2d 6f 77 6e 65 72 2d 66 75 6c 6c 2d 63 6f 6e 74 72 20 44 65 63 20 32 30 31 33 20 30 34 3a 35 37 3a 34 35 20 47 4d 54 0a 78 2d 61 6d 7a 2d 73 74 6f 72 61 67 65 2d 63 6c 61 73 73 3a 53 54 41 4e 44 41 52 44 0a 2f 74 72 61 6e 73 61 6c 74 61 70 6f 63 2f 33 31 30 35 33 31 35 30 30 31 35 30 38 30 30 2e 50 44 46</StringToSignBytes><RequestId>6A67D9A7EB007713</RequestId><HostId>BHkl1SCtSdgDUo/aCzmBpPmhSnrpghjA/L78WvpHbBX2f3xDW</HostId><SignatureProvided>SpCC3NpUkL0Z0hE9EI=</SignatureProvided><StringToSign>PUT

    application/x-pdf

    x-amz-acl:bucket-owner-full-control
    x-amz-date:Thu, 05 Dec 2013 04:57:45 GMT
    x-amz-storage-class:STANDARD
    /<bucketname>/310531500150800.PDF</StringToSign><AWSAccessKeyId><your_key></AWSAccessKeyId></Error>

    Tip: Pay attention to these error messages as they really give you a hint as to what you need to include in your "canonicalString".  I discounted these error messages early on and didn't take the time to really understand what Amazon was looking for.

    For completeness I will include the other three helper methods that are being used in the Expression Shape.  In my actual solution these values live in a configuration store, but for the simplicity of this blog post I have hard coded them.

    public static string SetAmzACL()
    {
        return "bucket-owner-full-control";
    }

    public static string SetStorageClass()
    {
        return "STANDARD";
    }

    public static string SetHeaderDate()
    {
        // Use GMT time and ensure that it is within 15 minutes of the time on Amazon's servers
        return DateTime.UtcNow.ToString("ddd, dd MMM yyyy HH:mm:ss ") + "GMT";
    }

  3. The next part of the Message Assignment shape is setting the standard context properties for the WebHttp adapter.  Remember, since we are using a Dynamic Send Port we will not be able to manipulate these values through the BizTalk Admin Console.

    msgS3Request(WCF.BindingType)="WCF-WebHttp";
    msgS3Request(WCF.SecurityMode)="None";
    msgS3Request(WCF.HttpMethodAndUrl) = "PUT";  //Writing to Amazon S3 requires a PUT
    msgS3Request(WCF.OpenTimeout)= "00:10:00";
    msgS3Request(WCF.CloseTimeout)= "00:10:00";
    msgS3Request(WCF.SendTimeout)= "00:10:00";
    msgS3Request(WCF.MaxReceivedMessageSize)= 2147483647;

    Lastly we need to set the URI that we want to send our message to and also specify that we want to use the WCF-WebHttp adapter.

    Port_SendToS3(Microsoft.XLANGs.BaseTypes.Address)="http://<bucketname>.s3-us-west-2.amazonaws.com/310531500150800.PDF";
    Port_SendToS3(Microsoft.XLANGs.BaseTypes.TransportType)="WCF-WebHttp";

    Note: the last part of my URI, 310531500150800.PDF, represents my resource.  In this case I have hardcoded the file name.  This is obviously something that you would want to make dynamic, perhaps using the FILE.ReceivedFileName context property, as shown in the sketch below.
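    A dynamic version of that address assignment might look like the following sketch, where msgIn is a hypothetical name for the inbound message and the file name comes from the FILE adapter's promoted property:

        // Sketch: build the S3 resource name from the received file instead of hardcoding it
        Port_SendToS3(Microsoft.XLANGs.BaseTypes.Address) =
            "http://<bucketname>.s3-us-west-2.amazonaws.com/" +
            System.IO.Path.GetFileName(msgIn(FILE.ReceivedFileName));

    Keep in mind that the canonicalString built in SetHttpAuth must reference the same resource name, otherwise the signature check will fail.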

  4. Once we have assembled our S3 message we will send it through our Dynamic Solicit-Response Port.  The message that we send to Amazon and receive back is once again of type System.Xml.XmlDocument.
  5. One thing to note is that the response you receive back from Amazon won't actually have a message body (this is in line with REST).  However, even though we receive an empty message body, we will still find some valuable context properties.  The two properties of interest are:

    InboundHttpStatusCode

    InboundHttpStatusDescription

    image

     

  6. The last step in the process is to write our Amazon response to disk.  As we learned in the previous point, the message body will be empty, but it does give me an indicator that the process is working (in a Proof of Concept environment).

Overall the Orchestration is very simple.  The complexity really exists in the Message Assignment shape. 

image

Testing

Not that watching files move is super exciting, but I have created a quick Vine video that demonstrates the message being consumed by the FILE Adapter and then sent off to Amazon S3.

 https://vine.co/v/hQ2WpxgLXhJ

Conclusion

This was a pretty fun and frustrating solution to put together.  The area that caused me the most grief was easily the Authorization header.  There is some documentation out there related to Amazon "PUT"s, but each call is different depending upon what type of data you are sending and the related headers.  For each header that you add, you need to include the related value in your "canonicalString".  You also need to include the complete path to your resource (/bucketname/resource) in this string, even though the convention is a little different in the URI.

It is also worth mentioning that /n Software has created a third party S3 adapter that abstracts some of the complexity in this solution.  While I have not used this particular /n Software adapter, I have used others and have been happy with the experience.  Michael Stephenson has blogged about his experiences with this adapter here.

2013–Year in Review and looking ahead to 2014


With 2014 now upon us I wanted to take some time to reflect on the past year.  It was an incredible and chaotic year but it was also a lot of fun!  Here are some of the things that I was involved in this past year.

MVP Summits

This year there were two MVP Summits: one in February and another at the end of November.  MVP Summits are great opportunities on a few different levels.  First off, you get to hear about what is in the pipeline from the product groups, but you also get to network with your industry peers.  I find these conversations incredibly valuable, and the friendships that develop are pretty special.  Over time I have built a worldwide network of so many quality individuals it is actually mind blowing.

(Pictures from February MVP Summit)

MVPSummit1b

 

At the attendee party at Century Link stadium

MVPSummit1

Dinner with Product Group and other MVPs

PGand MVPs

(Pictures from November Summit)

At Lowell's in the Pike Place Market in Seattle for our annual Integration breakfast prior to the Seahawks game.

Breakfast

A portion of the Berlin Wall with Steef-Jan at Microsoft Campus

Kent_Steef_BerlinWall

Dinner at our favourite Indian restaurant in Bellevue called Moksha.

Dinner

At Steef-Jan's favourite donut shop in Seattle prior to the BizTalk Summit.

Donuts

Speaking

This year I had a lot of good opportunities to speak and share some of the things that I have learned.  My first stop was in Phoenix at the Phoenix Connected Systems Group in early May.

The next stop was in Charlotte, North Carolina where I presented two sessions at the BizTalk Bootcamp event.  This conference was held at the Microsoft Campus in Charlotte.  Special thanks to Mandi Ohlinger for putting it together and getting me involved.

KentCharlotte

Soon after the Charlotte event I was headed to New York City, where I had the opportunity to present at the Microsoft Technology Center (MTC), alongside the product group and some MVPs, to some of Microsoft's most influential customers.

New York City

The next stop on the "circuit" was heading over to Norway to participate in the Bouvet BizTalk Innovation Days conference.  This was my favourite event for a few reasons:

  • I do have some Norwegian heritage so it was a tremendous opportunity to learn about my ancestors.
  • Another opportunity to hang with my MVP buddies from Europe
  • I don't think there is a more passionate place on the planet about integration than Scandinavia (Sweden included).  Every time I have spoken there I am completely overwhelmed by the interest in integration in that part of the world.

Special thanks to Tord Glad Nordahl for including me in this event.

NorwaySpeakers2

After the Norway event I had the opportunity to participate in the 40th annual Berlin Marathon with my good friend Steef-Jan Wiggers.  This was the second marathon I have run, and it was a tremendous cultural experience to run in that city.  I also shaved 4 minutes off my previous time from the Chicago marathon, so it was a win-win experience.

Celebrating

The last speaking engagement was in Calgary in November.  I had the opportunity to speak about Windows Azure Mobile Services, Windows Azure BizTalk Services and SAP integration at the Microsoft Alberta Architect forum.  It was a great opportunity to demonstrate some of these capabilities in Windows Azure to the Calgary community.

Grad School

2013 also saw me returning to school!  I completed my undergrad degree around 12 years ago and felt I was ready for some higher education.  I have had many good opportunities for growth in my career, but always felt that it was my technical capabilities that created those leadership and management opportunities.  At times I felt like I didn't have a solid foundation when it came to running parts of an IT organization, and that I could benefit from additional education.  I don't ever foresee a time when I am not involved in technology; it is my job but it is also my hobby.  With this in mind I set out to find a program that focused on the "Management of Technology".  I didn't want a really technical Master's program, and I also didn't want a full blown business Master's program; I really wanted a blend of the two.  After some investigation I found a program that really suited my needs: Arizona State University's MSIM (Master of Science in Information Management) through the W.P. Carey School of Business.

In August 2013, I headed down to Tempe, Arizona for student orientation.  During this orientation, the 57 other students in the program and I received detailed information about the program.  We were also assigned into groups of 4 or 5 people whom we would be working closely with over the course of the 16 month program.  There are two flavors of the program: you can either attend in person at the ASU campus or you can participate in the online version.  Living in Calgary, I obviously chose the remote program.

One thing that surprised me was the number of people from all over the United States in this program.  There are people from Washington State, Washington DC, Oregon, California, Colorado, New Mexico, Texas, Indiana, New York, Georgia, Vermont, Alabama, Utah and of course Arizona.  When establishing groups, the school tries to place you in a group within the same time zone; my group consists of people from Arizona, which has worked out great so far.  This is really a benefit of the program, as everyone brings a unique experience, which has been really insightful.

I just finished my 3rd course (of 10) and am very pleased with my choice of program.  Don't get me wrong, it is a lot of work, but I am learning a lot and really enjoying the content of the courses.  The 3 courses that I have taken so far are The Strategic Value of IT, Business Process Design, and Data and Information Management.  My upcoming course is on Managing Enterprise Systems, which I am sure will be very interesting.

If you have any questions about the program feel free to leave your email address in the comments as I am happy to answer any questions that you have.

388644_10151308828386207_698535131_n

 

Books

Unfortunately this list is going to be quite sparse compared to the list that Richard has compiled here, but I did want to point out a few books that I had the opportunity to read this past year.

Microsoft BizTalk ESB Toolkit 2.1

2013 was a slow year for new BizTalk books, in part due to the spike in books in 2012 and also the nature of the BizTalk release cycle.  However, we did see the Microsoft BizTalk ESB Toolkit 2.1 book released by Andres Del Rio Benito and Howard Edidin.

This book comes in Packt Publishing's new shorter format.  Part of the challenge with writing books is that it takes a really long time to get the product out; in recent years Packt has tried to shorten this release cycle, and this book falls into that new category.  The book is approximately 130 pages long and is the most comprehensive guide to the ESB Toolkit available.  I have not seen another resource with as much detailed information about the toolkit.

Within this book you can expect to find 6 chapters that discuss:

  • ESB Toolkit Architecture
  • Itinerary Services
  • ESB Exception Handling
  • ESB Toolkit Web Services
  • ESB Management Portal
  • ESB Toolkit Version 2.2 (BizTalk 2013) sneak peek

If you are doing some work with the ESB Toolkit and are looking for a good resource, then this is a good place to start. (Amazon)

ESB Book

 

The Phoenix Project: A Novel about IT, DevOps and Helping your Business Win

I was made aware of this book via a Scott Gu tweet, and boy, was it worth picking up.  This book reads like a novel but there are a lot of very valuable lessons embedded within it.  It was so relevant to me that I could have sworn I had worked with the author before, because I had experienced so much of what is in this book.  If you are new to a leadership role or are struggling in that role, this book will be very beneficial to you. (Amazon)

The Phoenix Project

 

Adventures of an IT Leader

This is a book that I read as part of my ASU Strategic Value of IT course.  It is similar in nature to The Phoenix Project and also reads like a novel.  In this case a business leader has transitioned into a CIO position.  This book takes you through his trials and tribulations and really begs the question: is IT management just about management? (Amazon)

IT Leadership

The Opinionated Software Developer: What Twenty-Five Years of Slinging Code Has Taught Me

This was an interesting read as it describes Shawn Wildermuth's experiences as a software developer.  It was a quick read, and I love learning about what other people have experienced in their careers; this book provided excellent insight into Shawn's. (Amazon)

Shawn

Hard Facts, Dangerous Half-Truths, and Total Nonsense: Profiting from Evidence-based Management

Another book from my ASU studies, and an interesting one.  It reads more like a textbook, but the authors are very well recognized for their work in the business re-engineering space.  The biggest thing I took from this book is to not lose sight of evidence-based management.  All too often technical folks use their previous experiences to dictate future decisions: at a previous company or client a particular method worked, but taking that approach to a new company or client provides no guarantee that it will work again.  This book was a good reminder that a person needs to stick to the facts when making decisions and not rely (too much) on what has worked (or hasn't) in the past. (Amazon)

Hard Facts

 

2014

Looking ahead, I expect 2014 to be as chaotic and exciting as 2013.  It has already gotten off to a good start with Microsoft awarding me my seventh consecutive MVP award in the Integration discipline.  I want to thank all of the people working in the Product Group, the Support Group and on the Community teams for their support.  I also want to thank my MVP buddies, who are an amazing bunch of people that I really enjoy learning from.

MVP_FullColor_ForScreen

Also, look for a refresh of the (MCTS): Microsoft BizTalk Server 2010 (70-595) Certification Guide book.  No, the exam has not changed, but the book has been updated to include BizTalk 2013 content related to the Microsoft BizTalk 2013 Partner competency exam.  I must stress that this book is a refresh, so do not expect 100% (or anywhere near that) new content.

European Tour 2014


As I look at the calendar and see some important dates are quickly approaching, I thought I better put together a quick blog post to highlight some of the events that I will be speaking at in early March.

I will be using the same content at all events, but am happy to talk offline about anything that you have seen on this blog or in my presentation from Norway this past September.

The title of my session this time around is: Exposing Operational data to Mobile devices using Windows Azure and here is the session’s abstract:

In this session Kent will take a real world business scenario from the Power Generation industry. The scenario involves real time data collection, power generation commitments made to market stakeholders and current energy prices. A Power Generation company needs to monitor all of these data points to ensure it is maintaining its commitments to the marketplace. When things do not go as planned, there are often significant penalties at stake. Having real time visibility into these business measures and being notified when the business becomes non-compliant becomes extremely important.
Learn how Windows Azure and many of its building blocks (Azure Service Bus, Azure Mobile Services) and BizTalk Server 2013 can address these requirements and provide Operations people with real time visibility into the state of their business processes.

London – March 3rd and March 4th

The first stop on the tour is London, where I will be speaking at BizTalk360's BizTalk Summit 2014.  This is a 2 day paid conference, which has allowed BizTalk360 to bring in experts from all over the world to speak.  This includes speakers from Canada (me), my neighbor the United States, Italy, Norway, Portugal, Belgium, the Netherlands and India.  These experts include many Integration MVPs and the product group from Microsoft.

There are still a few tickets available for this event so I would encourage you to act quickly to avoid being disappointed.  This will easily be the biggest Microsoft Integration event in Europe this year with a lot of new content.

londonbanner

Stockholm – March 5th

After the London event, Steef-Jan Wiggers and I will be jumping on a plane and heading to Stockholm to visit our good friend Johan Hedberg and the Swedish BizTalk User Group.  This will be my third time speaking in Stockholm and 4th time speaking in Scandinavia.  I really enjoy speaking in Stockholm and am very much looking forward to returning to Sweden.  I just really hope that they don't win the Gold Medal in Men's Hockey at the Olympics, otherwise I won't hear the end of it.

I am also not aware of any Triathlons going on in Sweden at this time so I should be safe from participating in any adventure sports.

At this point an EventBrite is not available but watch the BizTalk Usergroup Sweden site or my twitter handle (@wearsy) for more details. 

icy-harbour-stockholm

Netherlands – March 6th

The 3rd and last stop on the tour is the Netherlands, where I will be speaking at the Dutch BizTalk User Group.  Steef-Jan Wiggers will also be speaking, as will René Brauwers.  This will be my second trip to the Netherlands but my first time speaking there.  I am very much looking forward to coming back to the region to talk about integration with the community and sample Dutch pancakes, stroopwafels and perhaps a Heineken (or two).

The EventBrite is available here and there is no cost for this event.

amsterdam

See you in Europe!


Learning Mule ESB


 

I recently joined MuleSoft and have had quite a few people ask me how they can get started with the platform.  These people typically have some integration experience on other technology stacks and are curious about the buzz that MuleSoft is creating in the industry.  Instead of copying and pasting from email to email I figured that I would put together a blog post that identifies some beneficial resources for learning Mule ESB.  I will try to keep this post up to date as new material emerges.

Before getting started with any of the learning resources, you will need to download the Mule ESB platform.  MuleSoft provides a free Community Edition that allows you to build and run Mule applications.

In addition to a Community Edition (CE), a commercial product called Enterprise Edition (EE) also exists that provides some additional Enterprise features.

The Bits

Mule ESB Community Edition
– Free download of the community edition of the software including the Mule AnyPoint Studio IDE for developing interfaces for the Mule Platform.  These tools can be run on Windows, Mac and Linux.
 
Tutorials

First 30 Minutes with Mule
– An introduction to the platform and simple walkthrough of your first Mule Application.

First Day with Mule
– Some more concepts are introduced including Message States, Global Elements, Content Based Routing and Connector Tutorials.

First Week with Mule
– Some more advanced concepts are introduced including Application Architecture, Security and Extending Mule.
 
Video Clips/Webcasts

Mule 101: Rapidly Connect Anything, Anywhere – Discover MuleSoft's Data Mapper, 120+ out-of-the-box connectors, development tools and deployment options.

Mule 201: Develop and manage a hybrid integration application – Learn about Legacy Modernization, Service Orchestration and Connectors.  Also learn to deploy your Mule applications and manage/monitor them through the Mule Management Console.

MuleSoft's YouTube Channel – Find a lot of short demonstrations and promotional material.  Demonstrations include SAP, SalesForce, Marketo, LinkedIn, Amazon S3, Hadoop, Netsuite, Twitter and many more.
 
Books 

These are the books that I have read.  I can confidently say that I learned something from each of them.

Mule ESB Cookbook – I started with this one and found some easy-to-follow walkthroughs of common integration scenarios.

Getting Started with Mule Cloud Connect – This book focuses more on the Cloud and SaaS connectors.  It is a good read, but I would suggest covering the fundamentals first and then digging into these topics.

Mule in Action, Second Edition – This is the most comprehensive book of the three.  It gets into a greater level of detail than the cookbook and walks you through some rich examples.
 
Blog Posts

Here are few walkthroughs that I put together as I began my Mule journey.

Exposing Simple REST Service – As the title suggests, a simple REST service.

Exposing SQL Server Data as HTTP Endpoint – This post demonstrates how to expose SQL operations through HTTP and return responses in JSON format.

Exploring Mule ESB SFTP Adapter – Since I have used the SFTP adapter on other platforms, I was curious to take a peek at MuleSoft's solution.

Twitter Integration – A quick look at the MuleSoft Twitter connector that allows you to interact with the Twitter API in a very elegant way.  In this example I update my Twitter status via Mule ESB.
 
.Net Resources

On this blog you will without doubt find a lot of Microsoft-related content.  MuleSoft is a company that is driven to connect any system to any device on any platform, and there are some activities in the pipeline to better support .NET and other Microsoft products/services like SharePoint, Dynamics and Azure.  With this in mind, I figured I would include a few links that may be of interest to people who are interested in integrating Microsoft technologies.

Connect .NET to anything, anywhere – Whitepaper

.NET Connectivity – Article 

In addition to what you will find in those articles here are some of the ways that Mule ESB integrates with Microsoft technologies.

Mule ESB Anypoint Connectors for Microsoft platforms

  • MSMQ
  • AMQP
  • Active Directory
  • SOAP/WS-* (WCF interoperability)
  • REST (ASP.NET WebAPI interoperability)
  • SharePoint
  • SQL Server
  • Microsoft Dynamics GP
  • Dynamics CRM
  • Dynamics Online
  • Excel/CSV
  • Yammer

 

 

Lastly, I wanted to mention an upcoming event in San Francisco where you will be able to learn more about the .Net investments and other areas of focus for MuleSoft.  Click on the image below for more details.

image

Speaking at Connect 2014


On May 27th-29th, in San Francisco, MuleSoft is hosting one of the largest integration events of the year called Connect.  There is an impressive list of speakers including:

  • Greg Schott, CEO MuleSoft
  • Ross Meyercord, CIO Salesforce
  • John Collison, Co-Founder Stripe
  • Ross Mason, Founder MuleSoft
  • Ben Haines, CIO BOX
  • Uri Sarid, CTO MuleSoft

I am fortunate to have the opportunity to participate in the event.  I am speaking in the Insurance track and also co-presenting on Integrating heterogeneous environments.

In the Integrating heterogeneous environments session I will be demonstrating some of the investments that MuleSoft has been making in the area of Microsoft integration, and more specifically some of the work we have been doing with WCF and .NET.

Without giving away too many details, there will be a healthy dose of RAML, APIs, SalesForce, .NET, REST/JSON, WCF and Mobile in the demos. 

image

So if you haven’t signed up, there are still some spots available.  You can do so here.

My MuleSoft Connect 14 Recap


This past week over 1200 attendees descended upon San Francisco to attend Connect and APICon (a sister conference).  If you are interested in the company recap of Connect, you can check that out here.  The purpose of this post is to highlight some of my thoughts from the event.

This was my first time attending Connect and I had a great time chatting with attendees, customers and other Muleys about everything currently going on in the industry.  I also had the opportunity to co-present in two sessions. Whenever I speak at a conference like this I generally provide a post conference write up and figured I would take this opportunity to share some of the details from my sessions.

The Connected Insurer

In this session I shared the stage with Aaron Landgraf (Product Marketing Manager), where we discussed some of the themes taking place in today's insurance companies.

The first point is that the Insurance landscape is changing:

  • Organizations are trying to differentiate themselves via Customer Service.  This may include the ability for customer self-service via the web or mobile.
  • Omni (or multi) channel engagements.  Customer interactions are no longer going to occur within one particular channel; they may start over social media, continue over email and conclude through the web.
  • Lastly, there should be a single (or 360-degree) view of the customer.  Regardless of the system being used to service the customer, all contextual information needs to be made available within that view.  This information may be pulled from a variety of systems in real time and aggregated while setting the appropriate context.

image

Another aspect of the presentation was to discuss some of the current challenges within this industry vertical.

  • Customers are looking for improved customer service while organizations are trying to reduce costs.
  • Any new solution will likely need to interface with existing (or legacy) assets.
  • Delivering solutions needs to be done in a very timely manner.

image

Next I discussed a high-level architecture where we can address some of these challenges by providing RESTful APIs modeled in RAML and a service orchestration layer, hosted by Mule ESB, that is capable of orchestrating back-end services.

image

An actual implementation of this architecture can be found in the next image.  More on this solution towards the bottom of the post.

image

Integrating the Heterogeneous Enterprise

The next session I co-presented with Ken Yagen (VP Products), who walked us through some trends occurring within the industry, including:

  • 36% of IT’s time is spent maintaining legacy systems and 67% of global execs said IT will spend the same or more time on these systems.
  • Cloud Services are used to augment current services.  65% of CIOs have deployed cloud solutions as a way to bolster existing services.
  • Enterprise Mobility: 45% of organizations will spend at least $500K on mobility projects over the next 12 to 18 months.

The reality is that organizations are being forced to integrate heterogeneous environments.  No longer can organizations say we are an "ABC" shop or an "XYZ" shop.  While Enterprise Architectures may push for reducing technical diversity, the increasing adoption of Mobility, SaaS, Cloud and Social platforms is disrupting traditional architectures.  As a result, a flexible yet comprehensive platform is required to address these needs.  At MuleSoft, we feel that our AnyPoint Platform is the answer when integrating heterogeneous environments.

image

Demo

I put together a demo that addresses a lot of the challenges and trends discussed across these two presentations, including:

  • Mobility
  • RESTful APIs (modeled in RAML)
  • Service Orchestration via Mule ESB
  • Integrating Heterogeneous systems (SQL Server, WCF Services, .NET and SalesForce)
  • I also took the opportunity to introduce some of the new features of the AnyPoint Platform May 2014 release (including Scatter-Gather, Web Services Consumer and an enhanced DB connector which supports DataSense)

Here are a few of the screenshots from the Windows 8 Mobile App that was built using C#/XAML.

image

 

image

image

image

image

image

RAML Definition

image

AnyPoint Studio – Mule ESB solution (well, part of it – it didn't fit in one screenshot)

image

Conclusion

Overall it was a great week with a lot of interesting sessions and conversations. Whether or not you were able to attend, the good news is that the London edition of Connect is right around the corner, from September 24th – 26th. You can sign up for more details here.

MuleSoft dot NET


For those of you who have been keeping your AnyPoint Studio up to date, you may have been pleasantly surprised this week.  The reason?  MuleSoft released two important connectors for customers who leverage the Microsoft platform in their architectures.  (You can read more about the official announcement here.)

More specifically, the two capabilities that were released this week include:

  • .NET Connector
  • MSMQ Connector

image

The MSMQ Connector is self-explanatory, but what is a .NET Connector?  The .NET Connector allows .NET code to be called from a Mule Flow.

Why are these connectors important? For some, hating Microsoft is a sport, but the reality is that Microsoft continues to be very relevant in the Enterprise.  In case you missed their recent earnings, they made $4.6 billion in net income for the past quarter…yes, that is a 'b', and yes, that was for only a quarter of the year. 

Many customers continue to use MSMQ.  Sometimes these are custom .NET solutions using MSMQ to add some durability to their messaging.  Sometimes they are legacy applications in maintenance mode, but not always.  Other use cases include purchasing a COTS (Commercial Off The Shelf) product that has a dependency on MSMQ.
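As a refresher on what that durability pattern looks like, here is a bare-bones send using the classic System.Messaging API; the queue path is purely illustrative:

using System.Messaging;

class DurableSend
{
    static void Main()
    {
        //Illustrative local private queue path
        string queuePath = @".\private$\orders";

        if (!MessageQueue.Exists(queuePath))
        {
            MessageQueue.Create(queuePath);
        }

        using (MessageQueue queue = new MessageQueue(queuePath))
        {
            Message msg = new Message("new order payload");

            //Recoverable = true persists the message to disk so it
            //survives a service or machine restart
            msg.Recoverable = true;
            queue.Send(msg);
        }
    }
}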

While the MSMQ Connector is a nice addition to the MuleSoft portfolio of Connectors, the .NET Connector is what really gets me excited.  I have been using .Net since the 1.1 release and am very comfortable in Visual Studio.

Many organizations have standardized on building their custom applications in .NET.  I have worked for these companies in the past, and for many of them, programming in another language is a showstopper.  There may be concerns about re-training, interoperability and productivity as a result of introducing new programming languages. Some people may consider this fear mongering, but the reality is that if you have a strong Enterprise Architecture practice, you need to adhere to standards. While some people are willing to introduce many different languages into an environment, others are not.

The combination of the AnyPoint Platform and the ability to write any integration logic that is required in .NET is a very powerful combination for organizations that want to leverage their .NET skill sets.

How do you invoke .NET code from a Mule Flow? There are many resources being made available as part of this release, so I don't want to spoil that party (see the conclusion for more resources), but let me provide a sneak peek. For those of you who may not be familiar with MuleSoft, we have the ability to write Mule Flows.  You can think of these much like a Workflow, or an Orchestration for my BizTalk friends.  On the right-hand side we have our palette, from which we can drag Message Processors or Connectors onto our Mule Flow.

image

Once our Connector is on our Mule Flow, we can configure it.  We need to provide an Assembly Type, an Assembly Path (which can be relative or absolute), a Scope and a Trust level.  This configuration is considered to be a Global Element, and we only have to configure it once per .NET assembly.

image

Next we provide the name of the .NET Method that we want to call.

image

From there it is business as usual from a .NET perspective.  I can send and receive complex types, JSON, XML documents, etc.

image
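To make that a bit more concrete, here is a hypothetical sketch of the kind of .NET class you might point the connector at; the class, method and property names below are my own inventions for illustration, not part of the connector's API:

using System;

namespace MyCompany.MuleIntegration
{
    //Hypothetical component referenced by the .NET Connector's Global
    //Element via its Assembly Type and Assembly Path settings
    public class OrderProcessor
    {
        //The method name configured on the connector; the Mule message
        //payload arrives as the argument and the return value becomes
        //the new payload
        public string Process(string payload)
        {
            Console.WriteLine("Received payload: " + payload);
            return payload.ToUpperInvariant();
        }
    }
}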

Conclusion

Hopefully this gives you a little taste of what is to come.  I have had the opportunity to work with many Beta customers on this functionality and am very excited with where we are and where we are headed.  What we are releasing now is just the beginning.

Stay tuned for more details on both the MSMQ and .NET Connectors.  Now that these bits are public I am really looking forward to sharing this information with both the Microsoft and MuleSoft communities.

Other resources:

  • Press Release
  • MuleSoft Blog Post including two short video demos and registration link for an upcoming Webinar.

BTW: If this sounds interesting to you, we are hiring!!!

Windows Azure 1.8 SDK – Queue to Queue Transfers


I was recently watching an episode of the Cloud Cover show where Abhishek Lal was talking about some of the recent features available in SDK 1.8.  One of the features that stood out for me was Queue to Queue transfers. A Queue to Queue transfer allows a publisher to push content to a Queue; that Queue can then automatically forward the message to another Queue.

This new capability supports a couple of design patterns:

  • Fan In – Where you have multiple systems publishing messages and you want to reduce the receiving endpoint surface down to a smaller number.
  • Fan Out – Where you have a single source message that you want to disseminate to multiple parties.

The focus of this post is the Fan In scenario.  The diagram below describes the messaging pattern that we would like to enforce.  In this case we have 4 publishers.  Let's pretend these are retailers who are requesting more supply of a particular product.  If we want isolation between publishers, then we would create a queue for each publisher.  However, if we are interested in in-order delivery, we now have race conditions on the Receiver side.  Since this is a BizTalk blog, BizTalk is acting as my Receiver.  Since we have 4 queues with 4 unique URIs, this translates into 4 BizTalk Receive Locations (the blue RL boxes below).  We do not have any control over when and how those messages are received.  In my opinion this problem gets worse if we are building our own .NET client that checks each queue looking for new messages.  Even if we try to be "fair" about the way we check the queues for new messages, we have no assurance of in-order delivery.

image

Let's make our lives easier and let the Service Bus maintain the order of messages through Queue to Queue transfers, giving us a single endpoint to consume from.  It will also simplify our BizTalk configuration, as we will only need 1 Receive Location.

image

 

Solution

Within Visual Studio I am going to create a Solution that has 3 C# console applications.

image

QueueToQueuePublisher Project

The core of the overall solution is the QueueToQueuePublisher project.  Within it there are two classes:

  • DataContracts.cs -  contains our class that we will use as our PurchaseOrder
  • Program.cs -  is where we will create our Queues and establish our Queue to Queue forwarding.

image

DataContracts Class

If we further examine the DataContracts class we will discover the following object:

namespace QueueToQueuePublisher
{
    //Simple POCO that will be serialized into the BrokeredMessage body
    public class PurchaseOrder
    {
        public string ProductID { get; set; }
        public int QuantityRequired { get; set; }
        public string CompanyID { get; set; }
    }
}

 

Program Class

In Program.cs things get a little more interesting


using System;
using System.Collections.Generic;
using Microsoft.ServiceBus;
using Microsoft.ServiceBus.Messaging;

namespace QueueToQueuePublisher
{
    public class Program
    {
        const string CommonQueueName = "CommonQueue";
        const string PubAQueueName = "PubAQueue";
        const string PubBQueueName = "PubBQueue";
        const string ServiceNamespace = "<your_namespace>";
        const string IssuerName = "owner";
        const string IssuerKey = "<your_key>";

        static void Main(string[] args)
        {

            TokenProvider credentials = TokenProvider.CreateSharedSecretTokenProvider(Program.IssuerName, Program.IssuerKey);
            Uri serviceUri = ServiceBusEnvironment.CreateServiceUri("sb", Program.ServiceNamespace, string.Empty);

            try
            {
                //*************************************************************************************************
                //                                   Management Operations
                //**************************************************************************************************         
                NamespaceManager namespaceClient = new NamespaceManager(serviceUri, credentials);
                if (namespaceClient == null)
                {
                    Console.WriteLine("\nUnexpected Error: NamespaceManager is NULL");
                    return;
                }

                Console.WriteLine("\nCreating Queue '{0}'...", Program.CommonQueueName);

                // Create Queue if it doesn't exist.
                //This Queue must exist prior to another
                //Queue forwarding messages to it
                if (!namespaceClient.QueueExists(Program.CommonQueueName))
                {
                    namespaceClient.CreateQueue(Program.CommonQueueName);
                }

                // Create Publisher A's Queue if it doesn't exist
                if (!namespaceClient.QueueExists(Program.PubAQueueName))
                {
                    QueueDescription qd = new QueueDescription(Program.PubAQueueName);

                    //This is where we establish our forwarding
                    qd.ForwardTo = Program.CommonQueueName;
                    namespaceClient.CreateQueue(qd);
                }

                // Create Publisher B's Queue if it doesn't exist
                if (!namespaceClient.QueueExists(Program.PubBQueueName))
                {
                    QueueDescription qd = new QueueDescription(Program.PubBQueueName);

                    //This is where we establish our forwarding
                    qd.ForwardTo = Program.CommonQueueName;
                    namespaceClient.CreateQueue(qd);
                }

            }
            catch (Exception ex)
            {
                Console.WriteLine(ex.ToString());
            }
        }
    }
}

 

Within this class, the purpose is to:

  1. Create the Common Queue, if it does not already exist.
  2. If Publisher A’s Queue does not exist, create a new Queue Description and include the ForwardTo directive that will forward messages from the Publisher A Queue to the Common Queue. We will then use this Queue Description to create the Publisher A Queue.
  3. If Publisher B’s Queue does not exist, create a new Queue Description and include the ForwardTo directive that will forward messages from the Publisher B Queue to the Common Queue. We will then use this Queue Description to create the Publisher B Queue.

The Program.cs class that is part of this project only needs to run once in order to set up and configure our queues.

 

Publisher A Project

The purpose of this Project is very simple.  We want to create an instance of a Purchase Order and publish this message to our Publisher A Queue.

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using Microsoft.ServiceBus;
using Microsoft.ServiceBus.Messaging;
using QueueToQueuePublisher;
using System.Runtime.Serialization;


namespace PublisherA
{
    class Program
    {

        const string SendQueueName = "pubaqueue";
        const string ServiceNamespace = "<your_namespace>";
        const string IssuerName = "owner";
        const string IssuerKey = "<your_key>";

        static void Main(string[] args)
        {
            //***************************************************************************************************
            //                                   Get Credentials
            //***************************************************************************************************
            TokenProvider credentials = TokenProvider.CreateSharedSecretTokenProvider(Program.IssuerName, Program.IssuerKey);
            Uri serviceUri = ServiceBusEnvironment.CreateServiceUri("sb", Program.ServiceNamespace, string.Empty);

            MessagingFactory factory = null;

            try
            {
                PurchaseOrder po = new PurchaseOrder();

                po.CompanyID = "PublisherA";
                po.ProductID = "A1234";
                po.QuantityRequired = 300;

                factory = MessagingFactory.Create(serviceUri, credentials);

                QueueClient myQueueClient = factory.CreateQueueClient(Program.SendQueueName);

                BrokeredMessage message = new BrokeredMessage(po, new DataContractSerializer(typeof(PurchaseOrder)));
                Console.WriteLine("Publisher A sending message");
                myQueueClient.Send(message);


            }
            catch(Exception ex)
            {
                Console.WriteLine(ex.ToString());
            }

        }
    }
}

Publisher B Project

This project is pretty much a carbon copy of the Publisher A project, the difference being that we send messages to our Publisher B Queue instead of the Publisher A Queue.  I have included it for completeness.

 

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using Microsoft.ServiceBus;
using Microsoft.ServiceBus.Messaging;
using QueueToQueuePublisher;
using System.Runtime.Serialization;


namespace PublisherB
{
    class Program
    {
        const string SendQueueName = "pubbqueue";
        const string ServiceNamespace = "<your_namespace>";
        const string IssuerName = "owner";
        const string IssuerKey = "<your_key>";

        static void Main(string[] args)
        {
            //***************************************************************************************************
            //                                   Get Credentials
            //***************************************************************************************************          
            TokenProvider credentials = TokenProvider.CreateSharedSecretTokenProvider(Program.IssuerName, Program.IssuerKey);
            Uri serviceUri = ServiceBusEnvironment.CreateServiceUri("sb", Program.ServiceNamespace, string.Empty);

            MessagingFactory factory = null;

            try
            {
                PurchaseOrder po = new PurchaseOrder();

                po.CompanyID = "PublisherB";
                po.ProductID = "B1234";
                po.QuantityRequired = 300;

                factory = MessagingFactory.Create(serviceUri, credentials);

                QueueClient myQueueClient = factory.CreateQueueClient(Program.SendQueueName);


                BrokeredMessage message = new BrokeredMessage(po, new DataContractSerializer(typeof(PurchaseOrder)));
                Console.WriteLine("Publisher B sending message");
                myQueueClient.Send(message);


            }
            catch (Exception ex)
            {
                Console.WriteLine(ex.ToString());
            }

        }
    }
}

BizTalk

From a BizTalk perspective we are going to keep this very simple.  We will simply create a Send Port Subscription and write a copy of any message that is retrieved from the Common Service Bus Queue to disk.

In order to configure a Send Port Subscription we need to first create a Receive Port and Receive Location.  This Receive Location will connect to our Service Bus namespace and will be looking for messages from the CommonQueue only.  As you may recall, the messages that are sent to the individual Publisher A and B Queues will be forwarded to this Common Queue. 

Also note, since I am not deploying any schemas, we want to use the PassThruReceive pipeline.  If you specify XMLReceive then BizTalk will look for a schema that doesn't exist.

image

Our Send Port will consist of using the FILE Adapter to write our messages to the local file system.

image

In order for our Send Port Subscription to work we do need to create a Filter based upon our Receive Port Name.

image

At this point we can enable our BizTalk Application.

 

Testing

In order to test our application we need to make sure that the QueueToQueuePublisher console application has been run.  This will create our common queue and our two publisher queues.  After running this application we should see the following within our namespace.

image

If we want to test our Queue To Queue forwarding we can simply create a test message in our pubaqueue and then receive the test message from our commonqueue.

image

image

image
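If you would rather verify the forwarding from code than from a tool, a minimal sketch along these lines should work; it reuses the namespace and issuer key placeholders from the projects above:

using System;
using Microsoft.ServiceBus;
using Microsoft.ServiceBus.Messaging;

class ForwardingTest
{
    static void Main()
    {
        //Send a test message to pubaqueue and receive it from commonqueue
        //to verify the ForwardTo configuration
        TokenProvider credentials = TokenProvider.CreateSharedSecretTokenProvider("owner", "<your_key>");
        Uri serviceUri = ServiceBusEnvironment.CreateServiceUri("sb", "<your_namespace>", string.Empty);
        MessagingFactory factory = MessagingFactory.Create(serviceUri, credentials);

        QueueClient sender = factory.CreateQueueClient("pubaqueue");
        sender.Send(new BrokeredMessage("test message"));

        //Service Bus forwards the message to commonqueue on our behalf
        QueueClient receiver = factory.CreateQueueClient("commonqueue");
        BrokeredMessage received = receiver.Receive(TimeSpan.FromSeconds(30));

        if (received != null)
        {
            Console.WriteLine("Received: {0}", received.GetBody<string>());
            received.Complete();
        }
    }
}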

Now that our Queue configuration has been verified we can run an instance of our PublisherA console application.

image

If we check our file folder that our send port is writing to we should see a new file has been written.

image

We can now perform the same actions with PublisherB.

image

image

Conclusion

As you can see, Queue to Queue forwarding is a pretty neat feature.  We can use it in Fan In messaging scenarios, as in our example, letting Service Bus worry about in-order delivery while simplifying a receiver's endpoint surface.  Arguably it creates more configuration in the cloud, so there may be a bit of a trade-off in that regard.

Currently the only way to configure the ForwardTo property is through the Management APIs; there is no GUI that allows you to take care of this.  But, without having any private knowledge, I am sure this is something that Microsoft will address in a future update.

Something else to be aware of is that BizTalk has no understanding of ForwardTo.  Nor would any other client of the Service Bus.  This “configuration” is occurring outside of client configurations which is the way it should be.  Any perceived complexities that exist should be abstracted away from systems that are communicating with Service Bus.

WCF-SAP User account not in validity date


Recently, I have been working with a new organization and have been building some interfaces that communicate with SAP. Over the Christmas break I continued to work on some of these interfaces.  On December 31st my interfaces were working just fine, but on January 1st I made a few changes and then ran my BizUnit regression tests just to make sure everything was ok.  To my surprise, my tests were failing and the issue was related to the following error.

SAPError

It just seemed like too much of a coincidence that the errors started occurring on the first day of the new year. My gut told me that my account must have expired. At my previous organization the SAP-BizTalk system accounts never expired, but they do at this one (which is probably not a bad thing).  The resolution in this case was for the SAP Basis team to update the validity date for the account.  Once this attribute was updated I could re-connect to SAP through my interfaces.  I have no idea why the SAP error message doesn't just say "your account has expired".

Book Review - Getting Started with BizTalk Services


 

I recently read the Getting Started with BizTalk Services book and decided to blog about my experience with it.  I have test-driven earlier versions of BizTalk Services (blog, blog and blog) and wanted to catch up on some of the more recent developments in this space.  Reading this book was a great way to get this type of information from one source.

Let's first start off with the authors: Jon Fancey and Karthik Bharathy.  Both Jon and Karthik are BizTalk veterans and are very well respected in the community.  I knew picking up the book that quality would not be a concern.

The book does not require a reader to possess extensive BizTalk Server experience.  It does help when drawing comparisons between features in Server vs Services.  Even without a lot of Microsoft Integration experience, a reader can be very productive working through the examples in this book.

image

Within the book you will discover 8 chapters covering 156 pages.  Each chapter contains some contextual background information followed by easy-to-follow examples.  The chapters are:

  • Introduction to BizTalk Services
  • Messages and Transforms
  • Bridges
  • Enterprise Application Integration
  • Business to business Integration
  • Management APIs
  • Tracking and Troubleshooting
  • Moving to BizTalk Services

Even though I have some experience with the BizTalk Services beta and its predecessors, I did learn some things from this book.  Probably the most valuable chapter for me was Chapter 2 – Messages and Transforms.  While I have used the new "mapper" in BizTalk Services, there were certainly some operations that I hadn't used before, including:

  • List Operations
  • Get Context Properties inside a map
  • If then else operation
  • Transform Exception Settings

Another useful chapter for me was Chapter 5 – Business to business Integration.  Even though the BizTalk Services platform is rather young, its EDI capabilities are known to be one of the strengths of the platform.  Since I have not done much in the EDI space, this chapter acted as an EDI primer and then related those EDI concepts to the BizTalk Services solution.

Lastly, the Management API chapter was interesting as well.  I have seen Steef-Jan present on the topic, but it was nice to read through some examples of how you can manage your BizTalk Services application.

Conclusion

The final chapter, Chapter 8 – Moving to BizTalk Services, discusses some of the current gaps between BizTalk Server and BizTalk Services. The authors drop some subtle hints about the features that are coming to close the parity gap that currently exists.  Based upon the agenda for the upcoming Integrate 2014 event, I suspect this gap will be closed rather quickly.  That makes now a great time to pick up this book and make sure you understand the fundamentals of BizTalk Services before some of these other features are announced, so you can hit the ground running.

The book can be sourced from both Amazon and PacktPub in both e-book and traditional paperback formats.  The Kindle version of the book is a mere $8.96 USD, which is an incredible bargain.


Integrate 2014 Summit


 

image

Recently, Microsoft and BizTalk360 announced the continuation of the Global BizTalk Summit that is held annually in the greater Seattle area.  This year the event moves from downtown Seattle to Redmond and has been branded Integrate 2014. The event will take place December 3rd – 5th, 2014 on the Microsoft campus.  If you are interested in Microsoft Integration technologies, this is a must-attend event. 

Based upon the current Session Schedule you will be exposed to:

  • New BizTalk Adapter Framework for BizTalk Services
  • New Workflow designer in BizTalk Services
  • New BizTalk Rules Engine in BizTalk Services
  • B2B Improvements in BizTalk Services
  • Internet of Things (Service Bus)
  • Hybrid Connectivity
  • Azure API Management
  • Customer Success Stories
  • Ask the Experts (MVP) interactions

As you can see there is a lot of unique and progressive content on display at this event.  I will be attending the event as will many other familiar faces from the Microsoft Integration community.

Registration is now open and there are some early bird tickets available until November 15th, 2014.

See you there!!!

Integrate 2014 Recap


I have recently made it home from a great week at Redmond’s Microsoft campus where I attended the Integrate 2014 event.  I want to take this opportunity to thank both Microsoft and BizTalk360 for being the lead sponsors and organizers of the event.

image image

I also want to call out the other sponsors, as these events typically do not take place without this type of support.  I think it is also a testament to just how deep Microsoft's partner ecosystem really is, and it was a pleasure to interact with you over the course of the week.

imageimageimageimage
imageimageimageimage
imageimageimageimage
imageimageimageimage
imageimage  

Speaking at the event

I want to thank Microsoft and BizTalk360 for inviting me to speak at this event. This was the first time that I have had the chance to present at Microsoft's campus, and it was an experience I don't think I will ever forget.  I have been to the Microsoft campus probably around 20 times for various events but had never had the opportunity to present, so it was a pretty easy decision.

One of the best parts of being involved in the Microsoft MVP program is the international network that you develop. Many of us have been in the program for several years and really value each other's experience and expertise.  Whenever we get together, we often compare notes and talk about the industry.  This time we had a great conversation about the competitive landscape.  We also discussed the way that products are being sold with a lot of buzzwords and marketecture; people were starting to get caught up in this instead of focusing on some of the fundamental requirements.  Much like any project, buying an integration platform should be based upon a formal, methodical, requirements-driven approach.

These conversations introduced the idea of developing a whitepaper where we would identify the requirements "if I was buying" an integration platform. Joining me on this journey were Michael Stephenson and Steef-Jan Wiggers.  We focused on both functional and non-functional requirements. We also took this opportunity to rank the Microsoft platform, which includes BizTalk Server, BizTalk Services, Azure Service Bus and Azure API Management.  Our ranking was based upon our experiences with these tools and how our generic integration requirements could be met by the Microsoft stack. This whitepaper is available on the BizTalk360 site for free.  Whether you are a partner, system integrator, integration consultant or customer, you are welcome to use and alter it as you see fit.  If you feel we have missed some requirements, you are encouraged to reach out to us.  We are already planning a 1.1 version of this document to address some of the recent announcements from the Integrate event.

My presentation focused on 10 of the different requirements that were introduced in the paper.  I also included a ‘Legacy Modernization’ demo that highlights Microsoft’s ability to deliver on some of the requirements that were discussed in the whitepaper.  This session was recorded and will be published on the BizTalk360 site in the near future.

 

Recap

Disclaimer: What I am about to discuss is all based upon public knowledge that was communicated during the event.  I have been careful to ensure that what is described is accurate to the best of my knowledge.  It was a fast and furious 3 days with information moving at warp speed. I have also included some of my own opinions, which may or may not be in line with Microsoft's way of thinking.  For some additional perspectives, I encourage you to check out the following blog posts from the past week:

Event Buildup

There was a lot of buildup to this event; with Integration MVPs seeing some early demos, there was cause for a lot of excitement.  This spilled over to Twitter, where @CrazyBizTalk posted this prior to the event kicking off.  The poster (I know who you are!) was correct: there has never been so much activity on Twitter related to Microsoft Integration. Feel free to check out the timeline for yourself here.

Embedded image permalink

Picture Source @CrazyBizTalk

Keynote

The ever-popular Scott Guthrie, otherwise known as "Scott Gu", kicked off the Integrate 2014 event.  Scott is the EVP of Microsoft's Cloud and Enterprise groups.  He provided a broad update on the Azure platform, describing all of the recent investments that have been rolled out.

Picture Source @SamVanhoutte

Embedded image permalink

Some of the more impressive points that Scott made about Azure include:

  • Azure Active Directory supports identity federation with 2342 SaaS platforms
  • Microsoft Azure is the only cloud provider in all 4 Gartner magic quadrants
  • Microsoft Azure provides the largest VMs in the cloud known as ‘G’ Machines (for Godzilla).  These VMs support 32 cores, 448 GB of Ram and 6500 GB of SSD Storage
  • Microsoft is adding 10,000+ customers per week to Microsoft Azure

For some attendees, I sensed some confusion about why there would be so much emphasis on Microsoft Azure. In hindsight, it makes a lot of sense.  Scott was really setting the stage for what would become a conference focused on a cohesive Azure platform with BizTalk as one of the centerpieces.

Embedded image permalink

Picture Source @gintveld

A Microservices platform is born

Next up was Bill Staples.  Bill is the General Manager for the Azure Application Platform, also known as the "Azure App Platform".  The Azure App Platform is the foundational 'fabric' that currently enables a lot of Azure innovation and will fuel the next generation of integration tools from Microsoft.

A foundational component of the Azure App Platform is App Containers.  These containers support many underlying Azure technologies and already operate at significant scale:

  • > 400k Apps Hosted
  • 300k Unique Customers
  • 120% Yearly Subscription Growth
  • 2 Billion Transactions daily

Going forward we can expect BizTalk ‘capabilities’ to run inside these containers.  As you can see, I don’t think we will have any performance constraints.

Embedded image permalink

Picture Source @tomcanter

Later in the session, it was disclosed that the Azure App Platform will enable new BizTalk capabilities in the form of Microservices.  Microservices will enable service composition in a really granular way.  We will have the ability to 'chain' these Microservices together inside a browser (at design time), while enjoying the benefits of deploying to an enterprise platform that provides message durability, tracking, management and analytics.

I welcome this change.  The existing BizTalk platform is very reliable, robust, well understood and well supported.  The challenge is that the BizTalk core, or engine, is over 10 years old, and the integration landscape has evolved with BizTalk struggling to maintain pace.

BizTalk capabilities exposed as Microservices put Microsoft at the forefront of integration platforms, leapfrogging many innovative competitors, and allow Microsoft's customers to enable transformational scenarios for their business.  Some of the Microservices that we can expect to be part of the platform include:

  • Workflow (BPM)
  • SaaS Connectivity
  • Rules (Engine)
  • Analytics
  • Mapping (Transforms)
  • Marketplace
  • API Management

Embedded image permalink

Picture Source @jeanpaulsmit

We can also see where Microsoft is positioning BizTalk Microservices within this broader platform: 

Embedded image permalink

Picture Source @wearsy

What is exciting about this new platform is the role that BizTalk now plays in it.  For a while now, people have felt that BizTalk is that system that sits in the corner that people do not like to talk about.  Now BizTalk is a key component within the App Platform that will enable many integration scenarios, including new lightweight scenarios that have been challenging for BizTalk Server to support in the past.

Whenever a new platform like this is introduced, there is always the tendency to chase 'shiny objects' while ignoring some of the traditional capabilities of the existing platform that earned you the market share you achieved.  Microsoft seems to have a good handle on this and has outlined the Fundamentals that they are using to build this new platform.  This was very encouraging to see. 

Embedded image permalink

Picture Source @wearsy

At this point the room was buzzing.  Some people were nodding their heads with delight (including myself), others were struggling with the term Microservice, and others were concerned about their existing requirements and how they fit into the new world.  I will now break down some more details on the types of Microservices that we can expect to see in this new platform.

Workflow Microservice

One of the current gaps in Microsoft Azure BizTalk Services (MABS) is workflow.  In the following image we see the workflow composer, which is hosted inside a web browser.  We have the ability to expose a workflow as a Microservice, but we can also pull other Microservices, such as a SaaS connector or a Rules service, into it.

Embedded image permalink

Picture Source @saravanamv

In the right-hand corner of this screen we can see some of these Microservices that we can pull in.  The picture is a little "grainy", but some of the items include:

  • Validation
  • Retrieve Employee Details (custom Microservice I suppose)
  • Rules
  • Custom Filter
  • Acme (custom Microservice I suppose)
  • Survey Monkey (SaaS Connector)
  • Email (SaaS Connector)

Embedded image permalink

Picture Source (@mikaelsand)

In the demo we were able to see a Workflow being triggered, with the tracking information made available in real time.  There is also the ability to schedule a workflow, run it manually or trigger it from another process.

Early in the BizTalk days there was an attempt to involve Business Analysts in the development of Workflows (aka Orchestrations).  This model never really worked well, as Visual Studio was just too developer-focused and the Orchestration Designer for Business Analysts (ODBA) just didn't have the required functionality to be a really good tool.  Microsoft is once again attempting to bring the Business Analyst into the solution by providing a simple-to-use tool hosted in a web browser.  I am always a bit skeptical when companies try to enable these types of BA scenarios, but I think the earlier failures were primarily driven by workflows being defined in an IDE instead of a web browser.

Embedded image permalink

Picture Source @wearsy

Once again, it is nice to see Microsoft focusing on key tenets that will drive their investment.  I am also glad to see some of the traditional integration requirements being addressed, including:

  • Persist State
  • Message Assurance
  • End to end tracking
  • Extensibility

All too often some of these 'new age' platforms provide lightweight capabilities but neglect the features that integration developers need to support their business requirements. I don't think this will be the case with BizTalk going forward.

Embedded image permalink

Picture Source @wearsy

SaaS Connectivity

A gap that has existed in the BizTalk Server platform is SaaS connectivity.  While BizTalk does provide a WebHttp Adapter that can both expose and consume RESTful services, I don't think that is enough (as I discussed in my talk).  Providing great SaaS connectors is mandatory: they make developers more productive and reduce the time to deliver projects.  Delivering value quicker is one of the reasons why people buy integration platforms, so having a library of full-featured, stable connectors for SaaS platforms is increasingly important.  I relate the concept of BizTalk SaaS connectors to Azure Active Directory federation; that platform boasts more than 2000 'identity adapters'.  Why should it be any different for integration?

The following image is a bit busy, but some of the Connector Microservices we can expect include:

  • Traditional Enterprise LOBs
  • Dynamics CRM Online
  • SAP SuccessFactors
  • Workday
  • SalesForce
  • HDInsight
  • Quickbooks
  • Yammer
  • Dynamics AX
  • Azure Mobile Services
  • Office 365
  • Coupa
  • OneDrive
  • SugarCRM
  • Informix
  • MongoDB
  • SQL Azure
  • BOX
  • Azure Blobs and Table
  • ….

This list is just the beginning.  Check out the Marketplace section of this post for more announcements.

Embedded image permalink

Picture Source @wearsy

Rules Microservice

Rules (Engines) are a component that shouldn't be overlooked when evaluating integration platforms.  I have been at many organizations where 'the middleware should not contain any business rules'.  While I agree with this approach in principle, it is not always that easy. What do you do in situations where you are integrating COTS products that don't allow you to customize?  There may also be situations where you can customize, but do not want to, as you may lose your customizations in a future upgrade. Enter a Rules platform.

The BizTalk Server Rules Engine is a stable and capable rules engine.  It does have some extensibility and can be called from outside BizTalk using .NET.  At times it has been criticized as being a bit heavy and difficult to maintain.  I really like where Microsoft is heading with its Microservice implementation that will expose "Rules as a Service" (RaaS? – ok, I will stop with that). This allows integration interfaces to leverage the Microservice, but also allows other applications, such as web or mobile applications, to leverage it.  I think there will be endless opportunities for the broader Azure ecosystem to use this capability without introducing a lot of infrastructure.

Embedded image permalink

Picture Source @wearsy
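As a reminder of the .NET extensibility mentioned above, calling the current BizTalk Server Rules Engine from .NET looks roughly like the sketch below; the policy name and fact class are hypothetical, and a deployed policy of that name is assumed:

using System;
using Microsoft.RuleEngine;

//Hypothetical .NET fact asserted into the engine
public class Order
{
    public decimal Amount { get; set; }
    public bool Approved { get; set; }
}

public class RuleCaller
{
    public static void Main()
    {
        Order fact = new Order { Amount = 15000m };

        //Resolves the latest deployed version of the policy from the rule store
        Policy policy = new Policy("ApprovalPolicy");
        policy.Execute(fact);  //the rules may mutate the fact
        policy.Dispose();

        Console.WriteLine("Approved: " + fact.Approved);
    }
}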

Once again, Microsoft is enabling non-developers to participate in this platform.  I think a rules engine is a place where Business Analysts should participate.  I have seen this work on a recent project with Data Quality Services (DQS) and don't see why it can't transfer to the Rules Microservice.

Embedded image permalink

Picture Source @wearsy

 

Data Transformation

Another capability that will be exposed as a Microservice is Data Transformation (or mapping).  This capability will also live in a web browser.  If you look closely at the following image you will discover that we will continue to have what appears to be a functoid (or an equivalent).

Only time will tell if a web browser will provide the power to build complex maps.  One thing that BizTalk Server is good at is dealing with large and complex maps, and the BizTalk mapping tools provide a lot of extensibility through managed code and XSLT.  We will have to keep an eye on this as it develops further.

image

 

Analytics

Within BizTalk Server we have Business Activity Monitoring (BAM).  It is a very powerful tool but has at times been accused of being too heavy. One of the benefits of leveraging the power of Azure is that we will be able to plug into all of the other investments being made in this area.

While there were not many specifics related to Analytics, I think it is a pretty safe bet that Microsoft will be able to leverage their Power BI suite, which is making giant waves in the industry.

One interesting demo they did show us was using Azure to consume SalesForce data and display it into familiar Microsoft BI tools.

I see a convergence between cloud-based integration, the Internet of Things (IoT), Big Data and predictive analytics.  Microsoft has some tremendous opportunities in this space as they have very competent offerings in each of these areas. If Microsoft can find a way to 'stitch' them all together, there will be some amazing solutions developed.

Picture Source @wearsy

Below is a Power BI screen that displays SalesForce Opportunities by Lead Source.

Picture Source @wearsy

Marketplace - Microservice Gallery

Buckle your seatbelts for this one!

Azure already has a market place appropriately called Azure Marketplace. In this Marketplace you can leverage 3rd party offerings including:

  • Data services
  • Machine Learning
  • Virtual Machines
  • Web applications
  • Azure Active Directory applications
  • Application services

You can also expect a Microservice Gallery to be added to this list.  This will allow 3rd parties to develop Microservices and add them to the Marketplace.  These Microservices can be monetized in order to develop a healthy ecosystem.  At the beginning of this blog post you saw a list of Microsoft partners who are active in the existing Integration ecosystem.  Going forward, you can expect these partners, other Azure partners and independent developers to build Microservices and publish them to this Marketplace.

In the past there has been some criticism about BizTalk being too .NET-specific and not supporting other languages.  Well, guess what? Microservices can be built using other languages that are already supported in Azure, including:

  • Java
  • Node.js
  • PHP
  • Python
  • Ruby

This means that if you wanted to build a Microservice that talks to SaaS application 'XYZ', you could build it in one of these languages and then publish it to the Azure Marketplace.  This is groundbreaking.

The image below describes how a developer would publish their Microservice to the gallery through a wizard-based experience.

Embedded image permalink

Picture Source @wearsy

Another aspect of the gallery is the introduction of templates.  Templates are another artifact that 3rd parties can publish and contribute.  Knowing the very large Microsoft ISV community and its deep domain expertise, this has the potential to be very big.

Some of the examples that were discussed include:

  • Dropbox – Office365
  • SurveyMonkey – SalesForce
  • Twitter – SalesForce

With a vast number of Connector Microservices, the opportunities are endless.  I know a lot of the ISVs in the audience were very excited to hear this news and were discussing which templates they are going to build first.

Embedded image permalink

Picture Source @nickhauenstein

What about BizTalk Server?

Without question, a lot of attendees are still focused on on-premises integration.  This is in part due to some of the conservative domains that these people support. Some people were concerned about their existing investments in BizTalk Server.  Microsoft confirmed (again) their commitment to these customers.  You will not be left behind!  On the flip side, I don't think you can expect a lot of innovation in the traditional on-premises product, but you will be supported and new versions will be released, including BizTalk Server 2015.

You can also expect every BizTalk Server capability to be made available as a Microservice in Azure. Microsoft has also committed to providing a great artifact migration experience that allows customers to transition into this new style of architecture.

Embedded image permalink

Picture Source @wearsy

Conclusion

If there is one thing that I would like you to take away from this post, it is the "power of the Azure platform".  This is not the BizTalk team working in isolation to develop the next generation platform.  This is the BizTalk team working in concert with the larger Azure App Platform team.  And it isn't only the BizTalk team participating, but other teams like the API Management team, the Mobile Services team, the Data team and many more, I am sure.

In my opinion, the BizTalk team being part of this broader team and working side by side with them, reporting up the same organization chart is what will make this possible and wildly successful.

Another encouraging theme that I witnessed was the need for a lighter-weight platform that does not compromise Enterprise requirements.  When you look at some of the other platforms that allow you to build interfaces in a web browser, this is what they are often criticized for.  With Microsoft having such a rich history in Integration, they understand these use cases as well as anyone in the industry. 

Overall, I am extremely encouraged by what I saw.  I love the vision and the strategy.  Execution will become the next big challenge. Since there is a very large Azure App Platform team providing a lot of the foundational platform, I do think the BizTalk team has the bandwidth, talent and vision to bring the Integration-specific Microservices to this amazing Azure platform.

In terms of next steps, we can expect a public preview of Microservices (including BizTalk) in Q1 of 2015.  Notice how I didn't say a BizTalk Microservices public preview?  This is not just about BizTalk; this is about a new Microservice platform that includes BizTalk.  As soon as more information is publicly available, you can expect to see updates on this blog.

2014 Year in Review and looking ahead to 2015


It is that time of year where I take a step back, reflect on the previous year and then start thinking about the upcoming year.  2014 was an incredible year, one that I will not forget any time soon.  I did a lot of travelling in 2014 and met a lot of great people, and I am grateful for the opportunities that allowed this to happen.

So let’s get into it, here are some of the events that took place over the past year.

Released another book

Johan Hedberg, Morten la Cour and I released a second edition of the (MCTS) Microsoft BizTalk Server (70-595) certification book to include new BizTalk 2013 content. We released the book in March 2014, as it coincided with the launch of the Microsoft Partner Assessment exam that Microsoft uses to measure its Silver and Gold Application Integration partners.

If you are interested in getting certified, either personally or as part of the Microsoft Partner network, I still feel that it is a great resource that will aid you in achieving your goal.

The book is available from both Amazon and Packt

image

European Adventure

In March, I had the opportunity to head to Europe to present at a few events.  These events included the BizTalk Summit in London on March 3rd/4th, the Swedish BizTalk User group in Stockholm, Sweden on March 5th and the Netherlands BizTalk User group on March 6th.

Special thanks to Saravana Kumar, Johan Hedberg and Steef-Jan Wiggers for allowing me to participate in these different events in your countries.  The BizTalk community is second to none in Europe so it is always an honour to head over there.

Even though it was an extremely busy week, we did have some time to see the sights in London. But otherwise a lot of time was spent at airports and in airplanes.

MuleSoft

After my return from Europe, I made a career change and joined MuleSoft. Even though MuleSoft has its headquarters in San Francisco, I was able to sign on as a remote Solutions Architect.

In 2014, MuleSoft made a lot of investments in supporting the Microsoft ecosystem.  Most of the activities I was involved with included Microsoft at some level. The highlight of my time there would have to be getting involved in the release of two investments: the ability to call .NET code from a Mule workflow (the .NET Connector) and the MSMQ connector. I had the opportunity to demonstrate these components at MuleSoft's flagship event, Connect, in San Francisco in May 2014.

I decided to leave MuleSoft at the end of September 2014.  The logistics of being a remote worker in Canada created some challenges, and as a result I decided to move on.  Overall, I value the time I spent at MuleSoft.  I got to learn about a new ecosystem, met a lot of great, smart people and got exposure to API Management, which I hadn't worked with previously.

After leaving MuleSoft, I traded my Eclipse IDE for Visual Studio.  I have shifted my attention back to the Microsoft Integration stack, including BizTalk, Azure Service Bus and Azure API Management. 

MVP Summit

I was able to attend the annual MVP Summit in Redmond.  It was another great opportunity to talk shop with my integration colleagues from around the globe. Since the Summit moved to the fall instead of spring, we have developed a bit of a tradition of heading to Lowell's, in Pike Place Market, for breakfast before heading to a Seattle Seahawks game.  Even though it rained pretty much the whole game, it was still a fun way to spend a Sunday afternoon.

 

Microsoft had a bit of a surprise for us once the sessions started.  It was at this point that we started to hear more about the next generation platform they were building.  This more than kept us occupied in many discussions throughout the week. 

 

Channel 9

The amount of brainpower that shows up for the MVP Summit is flat-out impressive.  Some of the brightest minds in the industry are present and eager to share their experiences.  Mark Mortimore, who runs marketing for BizTalk, decided to tap into some of this knowledge and had people record some short demos in advance of the Integrate 2014 event.  I took this opportunity to discuss a scenario I had built in the spring involving capturing data from a power plant and making it available to mobile devices via BizTalk, Azure Service Bus and Azure Mobile Services. You can view my clip and others here.

image

Integrate 2014

Microsoft held its annual North American Integration event in December 2014, and I had the opportunity to speak there. My topic involved considerations when choosing an integration platform.  The integration platform landscape has been very crowded lately and there are a lot of mixed messages flying around.  The point of my session was to take a step back and look at a requirements-based process for selecting a platform.  I discussed some of the top requirements that I would be looking for in a platform.

I also took this opportunity to announce a whitepaper that Michael Stephenson, Steef-Jan Wiggers and I had been working on, called Choosing the right Integration Platform. This whitepaper is available for free on the BizTalk360 site.

image

The Integrate 2014 event featured an incredible number of announcements and gave attendees a lot of insight into where Microsoft is headed with Integration.  You can find a more detailed recap of the event here.

ASU

During all of these activities, I was also pursuing a Master’s degree.  In December 2014, I graduated from Arizona State University with a 4.0 GPA.  The program I was enrolled in is the MSIM (Master of Science in Information Management), delivered on campus and online through the W.P. Carey School of Business.  For the past 16 months I spent approximately 20 hours a week on the program.

It was a great experience and I am glad that I had the opportunity to go through it.  I had some great teammates in my group and we worked really well together even though both of them are in Phoenix and I am in Calgary.

2015

In 2015, I am definitely looking for a slower pace and more time with my family. While I will continue to focus on technology in my spare cycles, I plan to spend more time skiing and running this year; between school and a lot of travel last year, those are activities I had to concede.

From a technology perspective, I certainly plan to dive deeper into the upcoming Microsoft Microservices offering, Azure API Management, Azure Event Hubs and Stream Analytics.

MVP Award

I received word that my MVP status has been renewed for 2015.  I want to thank both my Canadian MVP leads, Sim and Joel, and the Product Group leadership team, Vivek, Guru and Mark, for keeping me in the program even though I spent some time away on another platform.  2015 is going to be a really big year for Microsoft in the Integration space and I am thrilled to be involved.

I also want to take this opportunity to thank all my Twitter followers and the readers of this blog for their continued interest in what I have to say.  The Microsoft Integration community is truly amazing, and I am inspired to be part of it.  Here’s to a great 2015!

Slides from Calgary Azure Dev Camp

On January 10, 2015 I had the opportunity to speak about Azure Service Bus.  This was an introductory presentation on the subject, but I was able to sneak in a more complex demo at the end that included Azure Mobile Services and BizTalk.  I have uploaded the slides to SlideShare below.  If you want to see a video of the final demo, you can view it on Channel 9.
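
For anyone new to Service Bus, the sketch below shows the kind of send/receive basics the introductory portion of the talk covered. This is a minimal, illustrative C# example using the brokered messaging API from the WindowsAzure.ServiceBus NuGet package of the day; the connection string and queue name are placeholders, not values from the actual demo.

using System;
using Microsoft.ServiceBus.Messaging;

class ServiceBusBasics
{
    static void Main()
    {
        // Placeholders -- substitute your own namespace connection string and queue name.
        const string connectionString =
            "Endpoint=sb://yournamespace.servicebus.windows.net/;SharedAccessKeyName=RootManageSharedAccessKey;SharedAccessKey=yourKey";
        var client = QueueClient.CreateFromConnectionString(connectionString, "demoqueue");

        // Send a brokered message to the queue.
        client.Send(new BrokeredMessage("Hello from the Service Bus demo"));

        // Receive the message back (waiting up to 10 seconds) and complete it
        // so it is removed from the queue.
        BrokeredMessage received = client.Receive(TimeSpan.FromSeconds(10));
        if (received != null)
        {
            Console.WriteLine(received.GetBody<string>());
            received.Complete();
        }

        client.Close();
    }
}

The nice part about starting with queues is that the same send/receive pattern carries over to topics and subscriptions, which is where the more complex demo headed.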


Upcoming Presentation – February 2nd, 2015


Recently, Integration MVPs Michael Stephenson and Saravana Kumar launched a series of community-focused integration events.  What is unique about these events is that they have been designed to cater to an international audience.  Live webinars have been held every Monday evening (UK time) for the past couple of weeks, with several more scheduled in the near future.  They have created a website, appropriately called http://www.integrationusergroup.com/, where you can find all the details for these events.

When Michael asked if I was willing to present the talk I gave at Integrate 2014, I was happy to oblige.  On Monday, February 2nd I will be discussing “What to look for in an Integration Platform”.  I have included the overview for the session below, and I want to take this opportunity to thank Michael, Saravana and the BizTalk360 team for hosting these events and providing a great service to the Integration community.

Overview

The landscape for integration platforms continues to evolve. Cloud, hybrid, mobility, SaaS connectivity, API Management and microservices are introducing new architectural patterns and requirements. These new capabilities are also attracting new entrants to the integration platform industry. For customers who are looking to upgrade, or to adopt a new integration platform, this can be an overwhelming and confusing situation. In this talk, Kent will take a requirements-driven approach to selecting an integration platform.

http://www.integrationusergroup.com/?event=what-to-look-for-in-an-integration-platform&event_date=2015-02-02
