32-bit or 64-bit
The first decision when building the end application is whether it should be a 32-bit or 64-bit application. 64-bit is preferred, but there are many driving factors that can force 32-bit support to be used (even when running on an x64 operating system).
In short, 32-bit mode is forced when a piece of 32-bit legacy code runs somewhere in the application. Common culprits are older or third-party components such as external content management systems or search tools. Within the realm of Commerce Server, pipeline components will typically force 32-bit mode, as many ISVs have not updated their components for x64.
If 32-bit mode is used, up to 3 GB of memory is available per worker process. Memory consumption should be planned accordingly, along with separate worker processes to host multiple sites or applications if more than 3 GB is needed in aggregate.
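Assuming IIS 6.0 on an x64 operating system (the typical Commerce Server 2007 deployment), 32-bit worker processes are enabled with the adsutil.vbs metabase script; note this is a server-wide switch on IIS 6.0, not a per-application-pool one:

```batch
REM Enable 32-bit worker processes on a 64-bit IIS 6.0 server.
REM Note: on IIS 6.0 this applies to ALL application pools on the machine.
cscript %SystemDrive%\Inetpub\AdminScripts\adsutil.vbs SET W3SVC/AppPools/Enable32BitAppOnWin64 1
```

Because the switch is server-wide, hosting a mix of 32-bit and 64-bit applications requires separate servers under IIS 6.0.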
2-tier or 3-tier
When building a Commerce Server site, one of the most often-debated subjects is whether to use a 2-tier or 3-tier architecture. Many customers desire a 3-tier architecture. The principal factors driving 3-tier are:
· Ability to physically separate the database tier
· Better abstraction from a code perspective
· Ability to support other business channels or external punch-outs (e.g. – A site running on another platform needs to use Commerce Server as a back-end.)
The practical reality is that a 3-tier architecture will almost always perform worse than a 2-tier architecture. If performance and hardware footprint are paramount concerns, it should not be considered. If it is needed from a business requirements perspective, then it is a requirement. If it is indeed needed and performance is a concern, consider running both 2-tier and 3-tier code on top of the same Commerce Server instance, so that the 3-tier performance penalties are incurred only in the scenarios that explicitly need them.
Building a 2-tier Site
This section identifies the design considerations that need the most thought and analysis when architecting a 2-tier application with Commerce Server. A 2-tier application written with Commerce Server is fundamentally an ASP.NET application at its core, so all standard best practices for architecting an ASP.NET application apply. Because this is a well-known and well-understood problem space, ASP.NET best practices are not repeated here; instead, refer to http://www.asp.net/ or http://msdn.microsoft.com/en-us/asp.net/default.aspx.
When architecting a 2-tier ASP.NET application using Commerce Server, by far the single most important architectural consideration is caching. This alone will have more of an impact upon performance than almost any other single consideration.
There are several types of caching applicable to a Commerce Server application built in ASP.NET, specifically:
· Profile Cache – This is an intrinsic cache of data within Commerce Server’s Profile system.
· Catalog Cache – This is an intrinsic cache of data within Commerce Server’s Catalog system.
· Pipeline Pool – This is an in-memory pool of pre-instantiated Commerce Server Order Pipelines, aimed at reducing call time to the Pipeline.
· ASP.NET Caches – At a developer level, both data and output can be cached utilizing the standard means within ASP.NET.
With that understanding in hand, one can then apply all of these throughout the construction of an ASP.NET application based upon Commerce Server 2007. In an attempt to simplify caching philosophy to something that can be easily understood and followed, the guiding principles are:
· Always use the Pipeline Pool – it will greatly reduce call time on pages calling Pipelines.
· The Profile Cache is always on – understand how it works and plan accordingly as it cannot be disabled.
· Always cache the Commerce Server context objects in ASP.NET – this will greatly reduce the time required to instantiate and call the Commerce Server APIs.
· Balance Catalog Caching with ASP.NET output caching – otherwise the same data could end up cached in multiple locations, wasting memory.
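The principle of caching the Commerce Server context objects can be sketched as follows. In this minimal example, `CreateCatalogContext()` is a hypothetical helper standing in for whatever site-specific Commerce Server initialization the application performs; the point is that the expensive construction happens once per application rather than once per request:

```csharp
// Global.asax.cs -- create the expensive context once at application start.
using System;
using System.Web;

public class Global : HttpApplication
{
    protected void Application_Start(object sender, EventArgs e)
    {
        // Stored in application state so that every request reuses the same
        // initialized context object instead of re-creating it per request.
        Application["CatalogContext"] = CreateCatalogContext();
    }

    private static object CreateCatalogContext()
    {
        // Hypothetical helper: perform the Commerce Server catalog context
        // initialization here (an operation measured in seconds, not
        // milliseconds, which is exactly why it must not run per request).
        return new object();
    }
}
```

Pages then retrieve the object from `Application["CatalogContext"]` and cast it back to the appropriate type.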
To use the Pipeline Pool, all components must be free-threaded or neutral-threaded, or written in .NET. Any component that does not meet this criterion should be upgraded or replaced if at all possible, as it will entirely negate the ability to pool pipelines. Because no two pipeline configurations are the same, memory consumption should be profiled with Performance Monitor and factored into the application’s memory requirements. The pool size should be set based upon the number of pipelines likely to be active at one time, so as not to waste memory. This can be determined by building a projected usage profile of the site and/or examining Web log files: simply look at the number of concurrent calls to pages or paths that use pipelines and set the pool size accordingly. Note: The pool size is a global setting, so factor it across all pipelines in use in aggregate across the site.
The Profile Cache holds profile data in memory and then performs lazy, committed writes to SQL Server as data is updated. Unless strict session affinity is maintained, it can cause data inconsistency: a user can be bounced from one server to another in a Web farm and receive stale data, since the latest updates might not yet have been written to the database.
As session affinity can be complex to establish and maintain, a better practice is to force-write profile data to the database every time it is updated. This should still be done sparingly, as it is an expensive operation (write only upon updates, not every time Profiles are touched); session affinity considerations then disappear. The Profile cache size should be monitored in Performance Monitor to factor memory consumption requirements, and then set accordingly via its configurable parameter.
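The force-write pattern might look like the following sketch, using the `ProfileContext` and `Profile` types from `Microsoft.CommerceServer.Runtime.Profiles`. The profile definition name and property paths are illustrative, and the exact `GetProfile` overload should be verified against the Commerce Server 2007 API reference:

```csharp
using Microsoft.CommerceServer.Runtime;
using Microsoft.CommerceServer.Runtime.Profiles;

public static class ProfileWriter
{
    // Update a user profile and push the change through to SQL Server
    // immediately, rather than relying on the cache's lazy write-back.
    public static void UpdateEmail(string userId, string newEmail)
    {
        ProfileContext profiles = CommerceContext.Current.ProfileSystem;

        // "UserObject" and the property paths below are illustrative;
        // substitute the profile definitions used by your site.
        Profile user = profiles.GetProfile("GeneralInfo.user_id", userId, "UserObject");
        user.Properties["GeneralInfo.email_address"].Value = newEmail;

        // Update() commits the change to the data store now, so subsequent
        // reads on any server in the farm see current data without affinity.
        user.Update();
    }
}
```

Reserve this pattern for code paths that actually modify profile data; read paths should continue to benefit from the cache.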
The rest of the cacheable data on the site revolves around the Catalog. The best practice here is to output cache at a page fragment/user control level. Output caching in general will provide the best performance for caching catalog data. However, it needs to be done at the fragment/user control level rather than the page level; otherwise, items such as Profile authentication might also get cached, causing incorrect behavior.
With respect to output caching, one must consider that the site’s memory consumption will grow in proportion to the size of the catalog and how much of it is actually browsed. This is the significant complicating factor in an output caching strategy and represents the “price” of the performance, as there is no easy way to model the usage considerations. This must be planned based upon a calculated usage profile or analysis of actual Web logs. The effects of cache flushing, cache warming, or insufficient memory to cache everything must be factored against projected usage, and remediation steps may be needed (e.g. – scripted cache warm-up when resetting the entire Web farm, rolling updates, cache resets when staging data, etc.) to ensure proper behavior.
Additionally, with output caching, expirations must be set to ensure data is appropriately fresh. This is another reason fragments are important: data that must by necessity be fresher (such as Inventory) can have a lower expiration value, while more general catalog data can generally suffice with a higher one.
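The differing expirations can be expressed directly in the user controls’ @ OutputCache directives. The control and parameter names here are hypothetical; `Shared="true"` allows one cached copy of a user control to be reused across all pages that host it:

```aspx
<%-- ProductSummary.ascx: general catalog data; a long expiration suffices --%>
<%@ Control Language="C#" CodeFile="ProductSummary.ascx.cs" Inherits="ProductSummary" %>
<%@ OutputCache Duration="3600" VaryByParam="productId" Shared="true" %>

<%-- InventoryStatus.ascx: must stay fresh; use a much shorter expiration --%>
<%@ Control Language="C#" CodeFile="InventoryStatus.ascx.cs" Inherits="InventoryStatus" %>
<%@ OutputCache Duration="30" VaryByParam="productId" %>
```

`VaryByParam` ensures a separate cache entry per product, which is where the memory-growth consideration described above comes from.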
Writing ASP.NET code for extensibility and best practices versus writing it for performance are two different arts. A simple site that is lean and mean will perform best, but offer the worst extensibility and maintainability, and vice versa.
In general, if performance is of principal concern – outside of the caching practices described in the preceding section, consider:
· Do not use session state or view state
· Avoid utilizing the ASP.NET intrinsic objects such as the form controls
· Generally avoid many design patterns (such as UIP or Web Client Software Factories) and code strictly to the actual functional requirements
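The first recommendation above can be applied site-wide in configuration rather than page by page. A minimal web.config fragment (individual pages can re-enable these features via their @ Page directives if genuinely needed):

```xml
<!-- web.config: disable session state and view state site-wide. -->
<system.web>
  <sessionState mode="Off" />
  <pages enableViewState="false" enableSessionState="false" />
</system.web>
```

Disabling these by default and opting in per page keeps the performance cost visible and deliberate.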
Building an international-ready Commerce Server application is a mix of interesting considerations, largely because Commerce Server is not fully internationalized itself. A summary of the core product’s language capabilities is as follows:
· Profile: Single language, in any locale supported by the underlying data store.
· Marketing: Single language, in any locale supported by SQL Server.
· Catalog: Multi-lingual (e.g. – French and English versions of the same attribute can exist in the same record/row in the Catalog), in any locale supported by SQL Server.
· Inventory: Multi-lingual, as it relies upon the Catalog system.
· Orders: Single language, in any locale supported by SQL Server.
The Catalog is also single-currency by default. Separate fields will be needed to support multiple currencies, or a lookup table can be used to convert across exchange rates.
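The lookup-table approach can be sketched as follows. This minimal example keeps catalog prices in a single base currency and converts at display time; the rates shown are placeholders, and a real site would load current rates from its own table:

```csharp
using System;
using System.Collections.Generic;

public static class CurrencyConverter
{
    // Placeholder rates keyed by ISO currency code; in practice these
    // would be loaded (and periodically refreshed) from a database table.
    private static readonly Dictionary<string, decimal> RatesFromUsd =
        new Dictionary<string, decimal>
        {
            { "USD", 1.00m },
            { "EUR", 0.90m },  // placeholder rate
            { "GBP", 0.80m },  // placeholder rate
        };

    // Convert a base-currency (USD) catalog price to the shopper's currency.
    public static decimal Convert(decimal usdPrice, string currencyCode)
    {
        decimal rate;
        if (!RatesFromUsd.TryGetValue(currencyCode, out rate))
            throw new ArgumentException("No rate on file for " + currencyCode);
        return Math.Round(usdPrice * rate, 2);
    }
}
```

The per-currency-field alternative described below avoids the rate-management burden entirely at the cost of maintaining explicit prices.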
With the aforementioned understanding, an international ASP.NET-based site using Commerce Server can be built by observing the following considerations:
· When building the ASP.NET site, utilize resource files, thereby separating page content from page design from page code.
· Make sure that all labels that must surround the Profile and Order data are stored in resources at the ASP.NET level – this way only the form data is going back and forth to the database and no presentation tier elements are affected.
· Configure the Catalog as multi-lingual and retrieve data based upon the desired locale – matching it to the ASP.NET site locale.
· If multiple currencies are required, the best practice is to have one currency field per required currency and set explicit pricing. This avoids the variations caused by managing exchange-rate tables (few businesses price internationally via exchange rates regardless). Then pull the appropriate currency based upon the ASP.NET site’s locale.
· For Marketing, the situation is somewhat more challenging because it does not support multi-lingual data, and there are many display attributes such as showing discounts in the basket, advertisements containing text content, etc. The best way to handle this is to define a custom targeting criterion of locale, pass that value into the targeting context from the site, and then pull marketing campaigns by locale. This unfortunately requires that campaigns be duplicated for every locale supported by the site, but it solves the problem gracefully from a display standpoint. Alternatively, one can work around the single-language limitation by storing placeholder data in the database and resolving it to an appropriately localized resource string at runtime; this requires a fair bit of code and is somewhat awkward to implement, though arguably no more awkward than the first workaround.
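The resource-file side of the first two considerations can be sketched as follows. The resource file name "Storefront" is hypothetical (App_GlobalResources/Storefront.resx, Storefront.fr-FR.resx, and so on); `HttpContext.GetGlobalResourceObject` and the thread-culture properties are standard ASP.NET 2.0 APIs:

```csharp
using System.Globalization;
using System.Threading;
using System.Web;

public static class Labels
{
    // Resolve a UI label from a global .resx resource for the current
    // request culture. The correct culture-specific .resx is chosen
    // automatically by the resource manager.
    public static string Get(string key)
    {
        return (string)HttpContext.GetGlobalResourceObject("Storefront", key);
    }

    // Switch the request to the shopper's locale early in the page
    // lifecycle, so that resource lookup, formatting, and the catalog
    // language selection all use the same culture.
    public static void SetCulture(string cultureName) // e.g. "fr-FR"
    {
        CultureInfo culture = CultureInfo.GetCultureInfo(cultureName);
        Thread.CurrentThread.CurrentCulture = culture;
        Thread.CurrentThread.CurrentUICulture = culture;
    }
}
```

The same culture name can then be matched against the Catalog’s language list when retrieving multi-lingual catalog data.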
There is nothing special about securing a Commerce Server application written in ASP.NET; it is essentially just another Web site. The critical consideration is ensuring that no injection attacks or other forms of malicious behavior are possible through manipulation of the Web page. At the page design level, this means validating:
· Query strings
· Form posts
· HTTP Referrers (e.g. – make sure that only the correct preceding page (if in a sequence of pages) can call the current page)
· Cookies (to ensure that they cannot be easily forged)
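One common way to make cookies resistant to forgery is to append a keyed hash (HMAC) to the cookie value and verify it on every request. This is a generic sketch, not a Commerce Server facility; the key shown is a placeholder and must be a real secret shared by all servers in the farm:

```csharp
using System;
using System.Security.Cryptography;
using System.Text;

public static class SignedCookie
{
    // Placeholder key: in production, load a strong random key from
    // protected configuration so every farm server uses the same value.
    private static readonly byte[] Key =
        Encoding.UTF8.GetBytes("replace-with-a-real-secret-key");

    // Produce "value|base64(HMAC(value))" for storage in the cookie.
    public static string Sign(string value)
    {
        using (HMACSHA256 hmac = new HMACSHA256(Key))
        {
            byte[] mac = hmac.ComputeHash(Encoding.UTF8.GetBytes(value));
            return value + "|" + Convert.ToBase64String(mac);
        }
    }

    // Recompute the MAC and reject the cookie on any mismatch.
    public static bool TryValidate(string cookie, out string value)
    {
        value = null;
        int split = cookie.LastIndexOf('|');
        if (split < 0) return false;
        string candidate = cookie.Substring(0, split);
        if (Sign(candidate) != cookie) return false; // tampered or corrupt
        value = candidate;
        return true;
    }
}
```

A hardened implementation would also use a constant-time comparison and include an expiration timestamp inside the signed value.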
Pages in general, including custom error pages and other exception-handling sequences, should be made as non-CPU-intensive as possible to withstand a denial-of-service-style attack of bulk HTTP requests.
As there is nothing specific about Commerce Server here, the general ASP.NET security best practices apply very well.
If the Commerce Server Analytics system is to be used, several points must be considered during the site design process. By default, Commerce Server Analytics pulls data from two sources: the Commerce Server databases themselves and the IIS Web logs. SQL Server DTS tasks are used to import both sets of data.
If no schema customizations have been made beyond what is possible within the Commerce Server tools, the default DTS tasks for database-based data can be used. Otherwise, additional DTS tasks must be created and the data warehouse schema extended to facilitate importation of the customized data.
With respect to Web logs, they are relatively easy to import, but they tell only a small portion of the story. To properly tell how a user is navigating a site, additional data points must be collected based upon specific user actions. Commerce Server’s Analytics system facilitates this through the use of Custom Events. These can be dropped into any ASP.NET page programmatically, and they will capture and log additional data for use in the Analytics system. They do, however, slow down page processing and should be used with care so as not to adversely affect site performance.
From a reporting perspective, no data from custom events or custom DTS tasks will be integrated into the out-of-the-box reports. To fully leverage the newly collected data, report customizations must be made and factored into the overall project accordingly.
Building a 3-tier Site
Commerce Server was not designed to be a 3-tier application – at least not at the time the product was originally created in 1998-1999 (for Commerce Server 2000). However, in today’s world – 3-tier applications are inescapable.
Commerce Server 2002 Feature Pack 1 and Commerce Server 2007 introduced ASP.NET (ASMX)-based Web Services. These are intended for data management only, and are used internally by the BizTalk Adapters and the Business User tools. They are fully documented and are appropriate for use as-is, or customized, for back-end data update scenarios such as connecting to an SOA broker other than BizTalk.
They were not, however, designed for building a runtime site. In fact, many API gaps will be present, as only the data management aspects of the overall Commerce Server API are represented; and those that are represented are optimized for data management, not runtime scenarios.
A 3-tier Web site is typically built to enable scenarios such as the following:
· Communication from other systems in a punch-out scenario, such as Site A allowing checkouts to be processed by Site B
· Enablement of other devices/channels, such as handheld, mobility, kiosks, etc.
To build a 3-tier front-end Web site, the following considerations should be examined:
· Choice of technology – the obvious choices are ASP.NET Web Services (ASMX) or Windows Communication Foundation (WCF). Given future technology direction, a WCF-based solution will likely have far better longevity (in terms of future technology evolution) than a solution based on ASMX.
· Commerce Server – All of the standard considerations from building a 2-tier ASP.NET Web site apply equally to building a 3-tier Web site, regardless of whether the technology choice is ASMX or WCF.
· Service Design – This is probably the most critical piece. Web service calls are inherently expensive, so they should be consolidated as much as possible: “chunky” calls will always perform better than “chatty” calls. Unfortunately, Commerce Server’s API (like that of many other database-driven applications) assumes many small calls rather than fewer large ones. The onus thus shifts to the developer to encapsulate the Commerce Server calls within his or her own Web service design so that they behave well in batch mode. This will likely require a substantial amount of application-level data caching at the server (and possibly session affinity), and should be planned for accordingly in terms of both physical deployment and memory utilization.
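A “chunky” service contract might be shaped like the following WCF sketch. All type and member names here are illustrative: the single operation returns everything a product page needs (display data, price, inventory, cross-sells) in one round trip, with the service implementation internally making the several fine-grained Commerce Server calls:

```csharp
using System.Runtime.Serialization;
using System.ServiceModel;

// One operation per page-sized unit of work, instead of one per API call.
[ServiceContract]
public interface IStorefrontService
{
    [OperationContract]
    ProductPageData GetProductPage(string catalogName, string productId);
}

// A composed data transfer object: the "chunk" returned in a single call.
[DataContract]
public class ProductPageData
{
    [DataMember] public string DisplayName { get; set; }
    [DataMember] public decimal ListPrice { get; set; }
    [DataMember] public int QuantityOnHand { get; set; }
    [DataMember] public string[] CrossSellProductIds { get; set; }
}
```

The equivalent “chatty” design (separate GetProduct, GetPrice, GetInventory operations) would multiply the per-call overhead by the number of operations on every page render.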
Given that there is nothing Commerce Server-specific about building Web services with Windows Communication Foundation, general best practices apply. For more information, refer to http://msdn.microsoft.com/en-us/library/ms735119.aspx, a good set of resources for building WCF Web services in general, including those utilizing Commerce Server.