Thursday, October 16, 2008

Scaling out a MOSS 2007 implementation using Windows Network Load Balancing (NLB) clusters

Scaling out an existing implementation, or implementing highly available, high-performance clusters, is always a challenge. I have hammered my head against this requirement many times with different Microsoft products like BizTalk, SQL Server, MOSS 2007, SharePoint 2003, etc. From my experience I can say that scaling out a MOSS 2007 implementation, or building a high-performance cluster for it, is comparatively easier than with other Microsoft products. If we talk about performance, I would say it depends mainly on two things: the back-end SQL Server is the first, and the WFE (Web Front-End) tier is the second. There is other stuff that is also important for achieving high performance and availability, but in this post I would like to focus mainly on the WFE tier and on implementing a MOSS 2007 farm on a Windows Network Load Balancing (NLB) cluster.

To demonstrate, I have used two Windows Server 2003 machines in a single NLB cluster. I would also like to discuss some other Windows features that are very helpful in such an implementation: network connectivity and physical connections, IIS, and the NLB Manager tool.

  • Network connectivity and physical network design are very important for getting the best out of the cluster: the network between servers should be very fast and highly available, and the network design should keep the latency between any two servers as low as possible. 

  • To get the best out of IIS 6.0, we can enable HTTP compression, which also improves performance to some extent.

  • If the physical servers do not have identical configurations, I prefer to configure host priorities and set the load weight according to the relative capability of each server, using NLB Manager. If the WFE servers across your farm are not running similar services or do not have the same configuration, users or requests can experience an imbalance depending on which server serves them.
NLB Manager

Host Properties

You can see above how to configure load distribution between servers on specific ports, if required.
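For quick checks from the command line, the NLB control utility that ships with Windows Server 2003 can report the cluster's state (the port-rule load weights themselves are easiest to manage through NLB Manager, as shown above). A small sketch; run these on a cluster host:

```cmd
REM Check convergence and the state of every host in the cluster
wlbs query

REM Finish serving existing connections on this host, then stop taking
REM new ones (useful before taking a node down for maintenance)
wlbs drainstop
```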

Well, now we are all set to implement MOSS 2007 in a Windows Server 2003 NLB environment.

I have installed Office SharePoint Server 2007 on both nodes and selected all components to install. This part is up to your needs: whether you plan to dedicate the server to the WFE role or not (which will usually be the requirement in real-world scenarios).

That's it. What, am I reading "That's it"?! Yes, you are right: you just need to install on the NLB nodes. The other important thing is the load-balanced URL. I have already made an entry in the DNS server mapping the portal URL to the NLB virtual IP address, so when I create the portal (SharePoint application), clients can resolve the host header and translate the name to the virtual IP.

DNS Manager
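If you prefer the command line over the DNS Manager snap-in, the same A record can be added with dnscmd (from the Windows Support Tools). The zone name, host name, and virtual IP below are only placeholders for your own values:

```cmd
REM Map the portal host name to the NLB cluster's virtual IP
REM (zone, host name, and IP are hypothetical)
dnscmd /recordadd contoso.local portal A 192.168.1.50
```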

Now there could be two questions: if this is a fresh installation, how do we create the SharePoint application on the load-balanced cluster, and if it is a scale-out installation, how do we extend the existing application? Not a big deal; the answers are below.

How to create a new SharePoint application on the NLB nodes?

1. Go to the Create Application page under Application Management in the Central Administration site.

2. Configure the application as a standard new SharePoint application, but make sure you provide the DNS name we created for the load-balanced URL (which resolves to the virtual NLB cluster IP).

3. After creating the application, you can see that all load-balanced IIS servers have been populated with the new SharePoint application. Every IIS node server now has a physical web site in a separate location and a corresponding entry in IIS.
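The same creation can also be scripted with stsadm instead of clicking through Central Administration; a hedged sketch, where the URL and owner account are hypothetical and -url would normally point at the load-balanced name:

```cmd
REM Extend a new virtual server as a SharePoint web application
REM (URL and owner account are illustrative placeholders)
stsadm -o extendvs -url http://portal.contoso.local -ownerlogin CONTOSO\spadmin -owneremail spadmin@contoso.local
```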

Bingo!!!! We are done; the application is now load balanced. Let me access the application from different client machines and check the IIS logs of the web sites to see which servers are serving our requests.

WIN 1 server's log:

WIN 2 server's log:

You can see from the log snapshots above that both servers are serving our HTTP GET/POST requests, distributed between them.

We are spot on!! BINGO again!!

Now the second question: how to extend an existing SharePoint application onto a scaled-out NLB node?

A very simple answer to that question: you can extend the existing SharePoint application by extending the web application to another IIS web site from Application Management in the Central Administration site. Make sure you use the same load-balanced URL as in the example above.
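The command-line equivalent of this is the extendvsinwebfarm operation; the URL and IIS web site name below are placeholders:

```cmd
REM Extend an existing web application onto this newly added WFE server
stsadm -o extendvsinwebfarm -url http://portal.contoso.local -vsname "SharePoint Portal"
```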

Hope you like this article, please give your feedback to

Thursday, October 2, 2008

Installing SQL Server in a Windows Cluster environment as a failover cluster

I have spent the last few weeks building custom cluster environments and cluster-aware custom applications, so I took the chance to capture some snapshots of a SQL Server cluster installation. As with other Microsoft products, SQL Server cluster installation is very easy and simple. Let's have a look at the procedure to follow while installing SQL Server in a Windows Cluster environment. I set up my Windows cluster environment using Virtual Server 2005, built two Windows environments, and joined them both into a cluster. Windows cluster configuration is not in the scope of this article; I will surely produce an article on Windows clustering too, but here I would like to focus on the SQL Server 2005 cluster environment.

The same standard SQL Server installer allows you to set up the SQL cluster, but if your host OS is part of a Windows cluster, it enables a few extra options during installation for failover clustering of the SQL Server Database Engine and Analysis Services; no other service supports clustering. So let's go through the steps below to install the Database Engine as a failover cluster.

1. Follow the standard SQL Server installation steps. On the screen where you select components to install or upgrade, you will notice a check box labelled "Create a SQL Server failover cluster" for the SQL Server Database Service and Analysis Services only, as these are the only services that support failover clustering. Please have a look at the screen below; you must check the SQL Server failover cluster option to continue the installation or upgrade in cluster mode.

2. If you have not checked the box for a cluster installation or upgrade, the following screen will be the standard installation wizard; otherwise you will see the screen below for setting up the cluster. Every cluster needs a virtual server name, and that virtual name behaves much like a named instance: you provide the virtual server name in connection strings when connecting to databases, or when managing the cluster through Management Studio. Before naming the SQL cluster, make a few important checks: to avoid conflicts in name and network resolution, provide a name that is unique among network objects (computer names, host names, etc.). Refer to the screen below.

3. The next wizard screen is very important: it configures the cluster server's virtual networking. The Virtual Server Configuration screen allows you to set up virtual IPs for the networks (these are the cluster networks we generally configure while setting up the Windows cluster itself). I would suggest creating separate public and private virtual IPs, and keeping cluster traffic off the heartbeat network channel for performance reasons. Refer to the two screens below.

4. The next screen allows you to add the resource to the appropriate manageable cluster group. In my setup I created just one cluster group, so there was not much choice for me to make :-), but it is really good practice to maintain proper cluster groups for better, cleaner administration. Another important input on this screen is the data files' location; it enumerates the physical drives you have added as resources in the cluster. The drive should be accessible to and shared among all cluster servers, with valid and proper permissions for the required operations; ideally it should be a SAN or shared SCSI device. I would also advise cross-checking after installation that the physical drive resource has been added to the dependencies of the SQL Server resource.

5. The next screen is for configuring cluster nodes, and allows you to select them. This was my first cluster node installation, so it displays just a single node in the list; the other node appears under the label "required node".

6. BINGO.... You are back on familiar ground: the standard installation starts, and the following steps are similar to a regular non-clustered installation.
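Once the installation completes, clients connect through the virtual server name chosen in step 2 rather than through any physical node, so a failover is transparent to them. A minimal sketch of such a connection string, assuming a hypothetical virtual server name SQLCLUST01 and a default instance:

```
Server=SQLCLUST01;Database=MyAppDb;Integrated Security=SSPI;
```

For a named instance it would be SQLCLUST01\INSTANCENAME instead; the database name here is also just a placeholder.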

A SQL cluster is very simple to install and also to manage. I hope this article helps you install it for the first time on your machine. I will try to come up with new articles on clusters and installation; I recently set up my R&D environments as clusters and am enjoying working on them. The next installations will be MOSS 2007 on a cluster and BizTalk 2006 R2. I will come up with those articles soon.

Hope you like this article, please give your feedback to

Tuesday, September 23, 2008

System Center Capacity Planner 2007 - A nice point to kick-off IT Infrastructure planning and design for MOSS 2007

System Center Capacity Planner 2007 - A nice point to kick-off IT Infrastructure planning and design for Microsoft Office SharePoint Server 2007 (MOSS 2007).

Recently, I have been trying out and exploring a number of best practices for designing and planning IT infrastructure, especially for BizTalk and SharePoint Server implementations. During this exploration I came across a very nice tool called System Center Capacity Planner 2007. In this post I would like to focus just on MOSS 2007.

The tool allows you to plan hardware, servers, and services across an organization, different locations, etc. A nice demo and help are already available inside the tool itself, so I will not repeat them here, but there are some snapshots I would like to post along with this article.

You can download the tool from below link.

The tool also provides a nice wizard to start planning. Have a look at some snapshots of designs and reports below.

1. Global Topology View

I tried to model an organization's global topology, where the server farm is located on one continent and connected to various offices and branches across the globe. Evaluating the implementation needs for such a layout (hardware design, scale-out processes for services, and load distribution) is tricky and time consuming. System Center Capacity Planner 2007 is a perfect tool for such analysis and design: it simulates load and produces statistics that can be very helpful for this kind of planning.

2. Site Topology View.

3. Server Role Configuration. This is very important for designing a scale-out topology, and it also helps when making decisions about buying new hardware and planning scale up/out.

4. User Profiles, which are role based and component based, and allow the tool to predict more accurately.

5. Simulation Report.

The tool generates rich statistics in its report, where you can estimate, for a given number of users, their roles, and the network latency: the longest transactions, I/O utilization, highest CPU utilization, bottleneck analysis, highest storage space utilization, connectivity utilization, etc. It also generates separate statistics for all servers, clients, SANs, WANs, and thresholds.

Please send your views and feedback at


Sunday, August 10, 2008

Microsoft Dynamics Ax 4.0 and Integration Methods (Part-1)

Microsoft Dynamics Ax 4.0 and Integration Methods (Part 1: WebServices - Application Integration Framework, AIF)

My Dynamics discovery :-) (nice term) reached the bank of the river "AXAPTA" a few days ago... Well, I worked on Axapta in a very small project, but that was a long time ago and with a previous version, too. So it's my turn to refresh and sharpen my knowledge of the Ax technology. Blogging is my hobby, and I cannot stop myself from writing the first post of the series.

If anything is exciting me these days, it is Microsoft Dynamics, and I bet this is the hottest topic among Microsoft professionals right now. Let's come to the technical subject. Axapta supports many integration methodologies for integrating across platforms, like the BizTalk Adapter (Axapta), web services, X++ AOS classes, and .NET/WCF. In this part of the article I am focusing on AIF web services and possible innovations; I humbly invite you to share your views and ideas on this article.

AIF already has a very strong and competitive infrastructure for developing and configuring functional web services that interact directly at the AOS level (application logic). To create such AIF web services, you can refer to the steps below. In the step where you create a web service integration point (action), you must have at least one web site configured to host the generated web service and its supporting files. That web site should be configured with Windows integrated authentication, as that is what the generated web.config supports by default; later you can change and reconfigure it.
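The authentication requirement above boils down to a small fragment of the site's web.config. This is shown in isolation as a sketch; the file Axapta generates contains much more than this:

```xml
<system.web>
  <!-- AIF web services expect Windows integrated authentication by default -->
  <authentication mode="Windows" />
</system.web>
```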

1. Configuration of WebSite.

2. Adding WebServices inside Actions.

By clicking the Generate button, Axapta populates all web services and reference assemblies in the Bin folder, as well as the supporting schema documents in the web share folder. Axapta only generates the 'Microsoft.Dynamics.IntegrationFramework.WebService.dll' assembly in the Bin folder; you need to place dependencies like 'Microsoft.Dynamics.IntegrationFramework.WebService.Process.dll' and the localization resource assemblies in the respective ISO-standard-named folders under Bin. If you configure the site in the default location they will already be present, but for another location you need to copy them manually.

3. Review the Web.Config file.

4. Generated files.

5. WebService view.

6. WebService Method view.

Up to here, the article has demonstrated configuring AIF web services with the default processes. Axapta also supports advanced customization and custom logic on top of the Axd classes. As an example, I am going to explore customizing the logic of the AxdSalesOrder class. Likewise, any new sys-level Axd class can provide an action point and the ability to generate such web services to integrate data between systems. Please have a look at the steps below for advanced customization of AIF actions.

7. Code help explorer.

The code help explorer can guide you through the classes and the different methods they support. Using it, you can find the meta information for every action class and its methods, as well as cross references.

8. Class tree to explore AxdSalesOrder Class.

9. Edit and customize code logic inside AxdSalesOrder.

Above you can see all the implemented logic behind AxdSalesOrder's action methods. You can customize it as per your requirements.

To discuss AIF web services further: they are both inbound and outbound data services. In an age when service orientation is in demand and proving itself very cost efficient, open ended, and system friendly, AIF is a big step towards making implementations and organizations more service oriented. The WS-* standards are broadly compatible across technologies, and if I focus on Microsoft only, newly introduced technologies like WCF and WF could also do well in this domain, with workflow processing and open, secure communication with Axapta. Web site integration will definitely be a great advantage of this method too. I welcome your comments and views on these thoughts.

Hope you like my article and please give your feedback to

Sunday, June 29, 2008

Experience on SSIS and SAP, inspired after viewing some presentation

Experience on SSIS and SAP, inspired after viewing some presentation.

I was inspired to write this article by one posted on an MSDN blog about SAP and SSIS. It's really cool stuff from Matt Masson, in PowerPoint presentations.

Please have a look at the presentations by Matt Masson below. It's really great material on SAP and SSIS!

BizTalk Adapter Pack (SAP) installation

BizTalk Adapter Pack (SAP) SSIS integration

BizTalk Adapter Pack (SAP) SSIS integration (Advanced)

Below I am just sharing my experience with SSIS and SAP BW/APO.

I have worked on a project integrating SAP BW/APO modules with other ERPs (like JDE, PeopleSoft, etc.). For that, we worked everything out by building IDocs inside an SSIS Script Component. It is very time consuming, but right now I don't think there is any simple option other than BizTalk to build and submit IDocs. I used to build all the required segments (control, header, and data) inside the script task and then compose the message through the Flat File Connection Manager to build the IDoc.

To the best of my knowledge, there is currently no tool on the market that gives full functionality for working with IDocs inside SSIS (maybe I am wrong; you need to search extensively for the features you are looking for and what is available). The best option, which I implemented almost a year ago in my first SSIS implementation, was creating IDocs programmatically inside SSIS. Obviously, I had to implement my logic inside a Script activity and the Flat File Connection Manager. It makes the system a bit slow, but it is still configurable and acceptable. It is time consuming to build the logic initially, but once built, I think it is the best option: it sticks to your requirements, with only a few code changes needed to build a new structured IDoc.
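To illustrate the idea of composing fixed-width IDoc segments programmatically, here is a deliberately simplified sketch in Python (the actual script component was .NET, and the real EDI_DC40/EDI_DD40 record layouts have many more fields and different widths than the made-up ones below):

```python
# Illustrative sketch only: field names and widths are NOT the real
# SAP IDoc record layouts, just a demonstration of the composition logic.

def fixed(value, width):
    """Left-align a value into a fixed-width column, truncating if needed."""
    return str(value)[:width].ljust(width)

def control_record(idoc_type, partner, direction):
    # Control segment: identifies the IDoc type, partner, and direction.
    return fixed("EDI_DC40", 10) + fixed(idoc_type, 30) + fixed(partner, 10) + fixed(direction, 1)

def data_record(segment_name, payload):
    # Data segment: segment name followed by its flattened field data.
    return fixed(segment_name, 10) + fixed(payload, 50)

def build_idoc(idoc_type, partner, segments):
    # One control record, then one data record per segment.
    lines = [control_record(idoc_type, partner, "2")]  # "2" used here to mean outbound (illustrative)
    lines += [data_record(name, data) for name, data in segments]
    return "\n".join(lines)

doc = build_idoc("ORDERS05", "JDE01",
                 [("E1EDK01", "order header"), ("E1EDP01", "order item 1")])
print(doc)
```

In the SSIS package, the equivalent of `build_idoc` ran inside the script task, and the resulting flat rows were handed to the Flat File Connection Manager.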

I am really hoping Microsoft will ship a SAP Connection Manager; anything close to automation, a parser, and a connection manager would add great value to the product and be a great help to people who do not want to connect to the SAP database directly and prefer to push data by the safer IDoc option.

In closing, I really want to appreciate Matt's efforts on these presentations.

Thanks for reading my views and please contact me for your feedback or suggestion on

Thursday, June 26, 2008

Looking beyond ETL with SSIS: next door can be EII - Enterprise Information Integration.

The paragraph below is a preface to the article and describes its inspiration; if you wish, you can skip to the next paragraph :-), but I would recommend reading it.

Recently many things have been cooking around Microsoft Data Services, and I hope it will surely include more services that can help implement difficult EII scenarios. There is a new term, Database as a Service (DaaS, a name cloned from SaaS). Frankly speaking, these days the war between terminologies is far bigger than the one between technologies. Aaahh, forget about it; it is very confusing which does what nowadays: terminologies have become the face of products, and patterns have become abstract :-). Anyway, let's come to the core of the article. I was working on a very nice requirement where a system needed to provide a merged data view from multiple systems (heterogeneous and multi-cultured sources :(..). Previously we did complete data integration (as an EAI implementation) with moderate latency across the whole system into one web application providing grid views and reports; it was used very frequently by users and was very important too. But this integration consumed 200 GB of database space just for viewing and reporting purposes (we kept only 15 days of data, and for that we had to invest a minimum of 250 GB of storage on the SAN, plus other resources for the EAI implementation: BizTalk Server, the MessageBox database, traffic, bandwidth, etc.). It was clearly a requirement for an EII implementation, but it had not previously been identified as one, nor implemented individually. After massive discussion (fights), we implemented that system as EII using only SSIS (SQL Server Integration Services 2005). That is where the inspiration for this article came from. Thanks for reading it.

Back to Business:

Enter the Microsoft.SqlServer.Dts.DtsClient namespace. I am an SSIS developer too, but I was not very familiar with its functionality earlier. For the problem I described in the paragraph above, we designed a few complex SSIS packages with heterogeneous data sources and data views, merge joins, lots of sorting, querying, cleansing, etc. For this demonstration I cannot show the original design, so I will use a sample design I developed specially for the article.

In the design below, I query two different data sources that live in entirely different databases but are connected through one common key, and I generate a single data set for viewing and reporting purposes. You can see the data table snapshots below: the first figure shows people's first names, and the second table shows their last names. The two tables are in different databases. For this concept, the data sources could be any kind of source supported inside SSIS.
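Conceptually, the merge join in the package does the following; here is a small sketch in Python with made-up rows (the real package operates on sorted SSIS buffers from the two databases, not on in-memory dictionaries):

```python
# Two row sets from different databases, keyed on a shared id column.
# Rows and column names are invented for illustration.
first_names = {1: "John", 2: "Jane", 3: "Bob"}    # from database test_1
last_names = {1: "Smith", 2: "Doe", 4: "Stone"}   # from database test_2

# Inner merge join on the common key, like the Merge Join transformation
# configured as an inner join: only keys present in both sources survive.
merged = [
    (key, first_names[key], last_names[key])
    for key in sorted(first_names.keys() & last_names.keys())
]

print(merged)
```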

Fig 1. Database tree (two different databases: test_1, test_2).

Fig 2. Table 1 (DB1: table containing first name information).

Fig 3. Table 2 (DB2: table containing last name information).

Now have a look at the sample SSIS package: I simply perform a merge operation and store the result into a DataReader destination.

Fig 4. Sample SSIS package.

I have also developed a console .NET application for testing and demonstration purposes, which calls the SSIS package as a single point of data connection and gives me the processed, integrated data in one single record set. Below is a sample snapshot of the code. In the real solution we used WCF services to expose the EII implementation to external domains and other applications, but here I am demonstrating through a C# .NET console application.

Fig 5. Console Application Code (C# Console Application).

Fig 6. Receiving data into one data reader.

Fig 7. Output.

You can see in the output snapshot above that I received the integrated query's result as a single set.

Isn't it good? There are many benefits to such an implementation. The major one is a shrunken EAI landscape, which decreases the support team size (cost: managers are very happy to hear that, an additional benefit) and frees resources to improve its performance. Another important benefit is that we can avoid unnecessarily exposing the database and our precious data to other systems that only need it for viewing; we can achieve this by providing and implementing parameterised, on-demand EII data services (DaaS :-D). I don't want to go into the other benefits, but such an implementation could definitely turn out to be beneficial for every organization with this kind of requirement.

Please share your views and comments, and if you have any suggestions, please feel free to send them to

Thanks for visiting my blog.

Wednesday, June 25, 2008

MOSS 2007 - Workflows and their exposure to Enterprise Information Integration and Enterprise Application Integration

I am personally very excited about MOSS 2007 and its brand-new key features. Over the last few weeks I have really immersed myself in MOSS's new key features and its innovative implementations in the areas that intersect with integration, whether Enterprise Information Integration (EII), Enterprise Application Integration (EAI), or data integration (ETL). I worked in the past with SharePoint Portal Server 2003, and it was really complicated to integrate different user activities/events across the enterprise through BizTalk and web services. It always dragged me into developing more custom components/web parts/web controls, calling web services, and transferring information and events through BizTalk Server. It looks complicated, but believe me, it was much tougher than it looks...

Now, on my new mission, I am trying to find more flexible integration/workflow-based implementations. Here I have developed a sample solution to support my view on information integration with MOSS 2007 tasks and announcements. Primarily, I am trying to develop a workflow that picks up announcements, with direct live integration with a sample Loan Application Management system (any custom application generating application details) through WF (hosted in a Windows service and connected through WCF for my sample; it is open space for any integration tool that provides direct integration with a MOSS site, like BizTalk ;-) etc.). Coming back to the sample: I have developed a pilot workflow that picks up each announcement and generates tasks in SharePoint, and also integrates with external applications through WCF/web services. Another workflow application is invoked on the completion event of those tasks, automatically generating and assigning the follow-up to-do tasks.

These workflows also provide the standard functionality of Windows Workflow Foundation, like listening and event-based routing (you can check my previous articles to learn more about event-based routing). This really empowers developers to go beyond the classic, traditional boundaries of SharePoint sites towards next-generation workflow- and information-oriented collaboration development.

Back to the sample: workflow1 picks up the announcement, processes it, and generates the proper task and other artifacts too. In the process of generating the automated task and artifacts, it utilizes several EII services to authenticate the credit and other historical information of the applicant.

You can see in the snap below that the first activity shape invokes the workflow on any new announcement arrival or creation. The workflow then receives the data and contract through a Get activity, while processing other code in parallel. At the end of successful processing of the whole parallel shape, it creates all the tasks needed for the announcement and finally logs the whole history. Please have a look at figure-1 and figure-2.

After deployment to your MOSS site, you can check the workflow status by going to Site Settings -> Galleries -> Workflows. See figure-3 and figure-4 for the site navigation and the workflow collection. The Site Collection Workflows page shows the association of the workflow with different activities, and the in-progress workflows too.

Let's come to the other sample workflow of the pilot solution, which watches for every Task completed event and, according to the event, routes the application further for consideration and human approval. It also involves itself in request/response with web services and the EII service contract through a workflow service, and similarly creates further to-do events in the MOSS site.
Have a look at the MOSS site snaps.

1. Creating an announcement for the sample loan application.

2. You can see in the figure below that the announcement has been added to the list. It shows id 11111 for the loan application.

3. After some time it was picked up by the workflow and processed. In the solution I append a string, " Processed By..."; you can see here that it has updated the title of the announcement and also added a new relevant task for the loan application. While creating the new task, it went through the WCF application and received back some valuable credit information and a rating.

4. The task's detail page; you can see the generated description and information here.

5. Once the task has been completed, workflow2 processes it, generates sibling tasks and to-dos, and also updates the title of the task (you can track related artifacts by the "11111" id).

Such features of MOSS 2007 make life easy for integration developers, and we can also develop infrastructure suited to MOSS 2007 features, enabling easier and quicker development for complex definitions.

Previously I used the BizTalk adapter or custom .NET code, but that always demanded huge resources to implement such workflow and integration for a SharePoint site, and it also brought high data volumes and security threats. An implementation like this workflow really helps to develop more abstracted information and service access layers, with less resource-consuming infrastructure artifacts. Analyzing this workflow pattern and the pilot application's performance results, it has the ability to shrink our transaction volume across BizTalk by 40% compared to before, with 50% better performance and easier maintenance. These are not derived figures; they are my estimates for the standard scenario I analyzed.

Hope you liked this article and please send your valuable feedback and ideas to